Next-generation 3D camera

Jabil, ams OSRAM, and Artilux combined their proprietary technologies in 3D sensing architecture design, semiconductor lasers, and germanium-silicon (GeSi) sensor arrays.

Photo Courtesy: Jabil Optical Engineering

Ian Blasch (IB), senior director of business development for Jabil’s Optics division, speaks with Aerospace Manufacturing & Design (AM&D) about the creation of the new camera.

AM&D: What is the technology, exactly?

IB: The 3D camera is based on a Time-of-Flight (ToF) sensor from Artilux and an illumination board containing 1,130nm VCSELs from ams OSRAM. The solution also includes software for processing depth and point-cloud information. The key distinction of this ToF solution is that it operates at 1,130nm, a new short-wave infrared (SWIR) wavelength for the industry; currently, the industry primarily supports ToF solutions at 850nm and 940nm.
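For readers unfamiliar with ToF sensing, many such cameras recover depth from the phase shift of a modulated illumination signal (indirect, continuous-wave ToF); the interview doesn't say whether this camera is phase- or pulse-based. A minimal Python sketch of the phase-based math, with an assumed 7.5MHz modulation frequency chosen only because it yields an unambiguous range of roughly 20m, the camera's stated target:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def phase_to_depth(phase_rad, f_mod_hz):
    """Depth from phase shift for a continuous-wave (indirect) ToF sensor.

    The measured phase encodes the round-trip distance c * phase / (2*pi*f_mod);
    depth to the target is half of that.
    """
    return C * phase_rad / (4.0 * np.pi * f_mod_hz)

# Assumed modulation frequency: c / (2 * f_mod) gives a ~20m unambiguous
# range, matching the operating range discussed in the interview.
f_mod = 7.5e6
print(phase_to_depth(np.pi, f_mod))  # ~10m (half the unambiguous range)
```

Lower modulation frequencies extend the unambiguous range but coarsen depth precision, part of the trade-off ToF designers face when pushing from 5m indoor cameras toward 20m outdoor ones.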

AM&D: What brought the three industry partners together to develop this camera?

IB: Jabil identified a pan-industry requirement for a 3D camera that could operate in high ambient light conditions with an operating range up to 20m. The market has many quality 3D sensing solutions for indoor operation up to 5m and many options for longer-range automotive applications, but there are few options for outdoor operation at ranges up to 20m. Jabil, with its long history in 3D sensing and strong supplier network, was able to target several key innovations in the industry and combine them into a solution. The collaborative nature of all three companies, working together to showcase a solution, was key.

AM&D: How does the camera’s near-infrared sensitivity improve robotics applications?

IB: By targeting 1,130nm, the camera takes advantage of a gap in the sun's spectrum at the Earth's surface, lowering the amount of background noise impacting the ToF signal. The Earth's atmosphere absorbs or reflects photons at the 1,130nm wavelength, reducing the number that reach the surface. The magnitude of the signal can also be increased, as laser eye-safety thresholds are higher at 1,130nm than at 850nm and 940nm. As a result, the signal-to-noise ratio is dramatically increased, enabling the 3D camera to operate in bright sunlight, reach longer ranges, and improve accuracy.
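To make the signal-to-noise argument concrete, here is a minimal, illustrative sketch assuming shot-noise-limited detection. The photon counts are invented for illustration, not measurements from this camera:

```python
import numpy as np

def tof_snr(signal_photons, background_photons):
    """Shot-noise-limited SNR for a ToF pixel: S / sqrt(S + B).

    Photon arrivals are Poisson-distributed, so noise grows with the square
    root of everything collected, laser signal plus solar background.
    """
    return signal_photons / np.sqrt(signal_photons + background_photons)

# Invented, illustrative counts: at 1,130nm the solar background (B) drops
# (atmospheric absorption gap) while the allowed laser signal (S) rises
# (higher eye-safety threshold), so SNR improves on both fronts.
print(tof_snr(1_000, 50_000))  # ~4.4  (940nm-like conditions)
print(tof_snr(3_000, 10_000))  # ~26.3 (1,130nm-like conditions)
```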

AM&D: What are its implications for the robotics industry?

IB: The key implication is that robotic platforms are no longer restricted to indoor operation. Given the high cost of autonomous mobile robots (AMRs) and other autonomous platforms, designing 3D cameras that are tolerant of ambient light increases their utility, safety, and reliability. Our goal, in collaboration with many component suppliers in the market, is to create low-cost 3D sensors whose performance is indifferent to environmental conditions, enabling forklifts, AMRs, agricultural equipment, platforms supporting aircraft operations, and more.

AM&D: How does the SWIR capability offer enhanced eye safety, outstanding performance in high sunlight environments, and skin detection capability, which is of critical importance for collision avoidance when, for example, humans and industrial robots interact?

IB: With respect to eye safety, lasers with wavelengths from 400nm to around 1,300nm pass through the eye's cornea and lens to reach the retina. If too much laser energy is absorbed by the retina, it can cause permanent damage. As one moves to longer wavelengths, more laser energy can be applied to the scene without violating laser safety thresholds. Therefore, more signal can be applied to a scene at 1,130nm than at 940nm, increasing the signal-to-noise ratio.

The performance in sunlight is addressed in a previous question, but it's important to note that the atmosphere doesn't absorb all SWIR wavelengths equally: absorption creates gaps in the surface solar spectrum around 1,130nm and 1,380nm. We're working with our partners to build solutions that exploit both gaps.

Finally, SWIR light is absorbed by water. Human skin contains water, and as a result, skin absorbs the active signal from the ToF camera rather than reflecting it back.
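The interview doesn't describe how Jabil turns that absorption into a detection algorithm, but a simple heuristic illustrates the idea: pixels whose range-normalized SWIR return is anomalously weak are candidates for skin. The threshold and normalization below are hypothetical:

```python
import numpy as np

def flag_possible_skin(active_return, depth_m, rel_threshold=0.05):
    """Flag pixels whose 1,130nm return is anomalously weak for their range.

    Water absorbs strongly in the SWIR, so skin reflects far less of the
    active signal than most dry materials at the same distance. The
    threshold and normalization here are illustrative, not product values.
    """
    # Compensate for inverse-square falloff so near and far pixels compare fairly.
    normalized = active_return * np.square(depth_m)
    return normalized < rel_threshold * np.max(normalized)

# Synthetic example: the two weak-return pixels are flagged as possible skin.
returns = np.array([0.80, 0.75, 0.04, 0.03, 0.70])
depths = np.array([2.0, 2.1, 1.9, 2.0, 2.2])
print(flag_possible_skin(returns, depths))  # [False False  True  True False]
```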

AM&D: The 3D sensor data from these innovative depth cameras will improve automated functions such as obstacle identification, collision avoidance, localization, and route planning – key applications necessary for autonomous platforms. Are there any automated guided vehicle applications being considered?

IB: 3D cameras operating at 1,130nm can also be installed on automated guided vehicles (AGVs), especially for collision avoidance. The wavelength should be considered for AGVs that operators want to move along loading bays, between buildings, or entirely outdoors.

AM&D: Beyond industrial/manufacturing robotics applications, can the technology be used in aerospace drones or advanced air mobility applications?

IB: As we work with many partners across the industry, investment in and performance of SWIR solutions will continue to improve. These components will be used to create innovative solutions for drone platforms. We see potential applications for autonomous platforms supporting the loading and unloading of cargo, weapons systems, etc. We're also hearing about autonomous vehicles being designed to support other airport and runway operations.

Jabil Optical Engineering

About the author: Jake Kauffman is associate editor with Aerospace Manufacturing & Design magazine and can be reached at jkauffman@gie.net.

D&A NEWS

Photo Courtesy: Advanced Navigation

Series B funding raises $68 million for AI company

Global investment firm KKR will lead a $68 million Series B funding round for Advanced Navigation, a developer of artificial intelligence (AI) robotics and navigation technology.

Headquartered in Australia, Advanced Navigation develops solutions that optimize performance and efficiency by leveraging AI neural networks and deep-learning algorithms. Its sensor products are sold into commercial and defense industries for sea, land, air, and space applications. Its AI approach allows solutions to achieve greater accuracy and reliability while maintaining a small form factor.

Advanced Navigation seeks to use the funding to accelerate research & development (R&D) programs focused on transformative robotic, navigation, photonic, and quantum sensing solutions, and to enhance its global sales and marketing capabilities. The company will also assess inorganic growth opportunities to incorporate new technologies and products that complement its existing products and areas of expertise.

Advanced Navigation
KKR


Photo Courtesy: Universal Robots

Universal Robots reaches 1,000 employees

Since launching its first collaborative robot (cobot) in 2008, Universal Robots (UR) has grown into a global market leader in cobots, with offices in more than 20 countries, and has now reached 1,000 employees.

UR started in 2005, when Esben Østergaard, Kasper Støy, and Kristian Kassow from the University of Southern Denmark were frustrated by how the robots of the time were heavy, expensive, and complicated to use. This gave them the idea to create a robot that’s flexible, safe to work with, and easier to install and program.

Since then, UR has developed a range of cobot products, selling more than 50,000 cobots worldwide. During the past year, UR has hired more than 200 employees to ensure the company is ready to realize the enormous growth potential that lies ahead.

Universal Robots