The Air Force has invented a 3-D, stereoscopic vision system that fuses visible, infrared and multispectral images from multiple cameras for maneuvering unmanned or autonomous vehicles. The technology is available to innovative businesses for development of new products or services.
While 3-D displays are rapidly becoming a popular entertainment medium, their use in real-world settings is less well defined. Incorporating 3-D technology into drone operation may be one of the first applications outside of gaming. A 3-D display gives operators of autonomous or remotely piloted vehicles the depth perception needed to maneuver aggressively in complex or dynamic environments. It may also benefit piloted vehicles by giving operators additional visual information about the environment the vehicle is traversing. Stereoscopic cameras that capture images for 3-D display are well known; however, these cameras alone provide no information beyond what is already available to the operator. An improvement is to capture additional data from across the electromagnetic spectrum.
The Air Force is on the cutting edge of research into remotely piloted vehicles, and toward that effort its scientists and engineers have developed a system for providing multispectral stereoscopic images from a moving platform.
In this bio-inspired approach, grounded in research on human visual perception, information from color low-light-level (LLL), short-wave infrared (SWIR), and long-wave infrared (LWIR, thermal) solid-state sensors and cameras is algorithmically fused to yield high-information 3-D video imagery in real time.
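The article does not disclose the fusion algorithm itself, but the idea of algorithmically combining co-registered frames from several spectral bands can be sketched with a simple weighted average. The function name, band weights, and toy frames below are illustrative assumptions, not the patented method; a stereoscopic system would run a step like this once per eye, on the left and right camera sets.

```python
import numpy as np

def fuse_bands(bands, weights):
    """Fuse co-registered single-band frames into one image by weighted average.

    bands:   list of 2-D float arrays, one per sensor (e.g. LLL, SWIR, LWIR),
             already aligned to a common geometry and normalized to [0, 1].
    weights: per-band weights reflecting the imaging goal (illustrative).
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()      # normalize so the output stays in [0, 1]
    stack = np.stack(bands, axis=0)        # shape: (n_bands, H, W)
    return np.tensordot(weights, stack, axes=1)

# Toy 2x2 frames standing in for aligned LLL, SWIR, and LWIR imagery.
lll  = np.array([[0.2, 0.4], [0.6, 0.8]])
swir = np.array([[0.1, 0.1], [0.9, 0.9]])
lwir = np.array([[0.0, 1.0], [0.0, 1.0]])

fused = fuse_bands([lll, swir, lwir], weights=[0.5, 0.25, 0.25])
```

In practice the weights (or a more sophisticated per-pixel rule) would be tuned to the mission, e.g. emphasizing LWIR for detecting warm targets at night.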
Using sensors that differ in their spectral sensitivities allows the display of scene information not normally visible to the unaided human eye. A visible-spectrum LLL TV camera can sense colors under very low ambient lighting. Near-infrared (NIR) can render night-time scenes illuminated only by starlight. SWIR can exploit atmospheric nightglow, faint upper-atmosphere emission driven by solar radiation that is available over the entire surface of the globe; SWIR can also penetrate atmospheric haze and works with eye-safe laser pointers. Mid-wave infrared (MWIR) and LWIR are sensitive to temperature differentials and readily reveal hot objects such as humans, fired weapons, and vehicles. NIR, SWIR, MWIR, and LWIR can all be used to break camouflage.
This technology could be embedded in the grille of an autonomous vehicle in order to provide enhanced safety through additional awareness. License fees are negotiable and TechLink provides no-cost licensing assistance.
- Incorporates multispectral imaging data into 3-D video
- Number and type of sensors are selected as a function of the imaging goals (surveillance, target detection, weapons aiming, aerial reconnaissance, 3-D mapping, navigation, underwater observation, space object analysis)
- System output can be converted for display on most available 3-D display technologies (anaglyphic, lenticular, holographic, swept-volume, shuttered/polarized)
- US patent 9,948,914 available for license
- Potential for collaboration with Air Force researchers
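Among the display technologies listed above, anaglyphic output is the simplest to illustrate. The sketch below is an assumption about how a fused stereo pair might be converted, not part of the patented system: it builds a red-cyan anaglyph from grayscale left/right frames by routing the left eye to the red channel and the right eye to green and blue.

```python
import numpy as np

def anaglyph(left, right):
    """Build a red-cyan anaglyph from left/right grayscale frames.

    left, right: 2-D float arrays in [0, 1], e.g. fused multispectral frames
    from the left and right camera sets. Returns an (H, W, 3) RGB image:
    left eye drives the red channel, right eye drives green and blue.
    """
    return np.stack([left, right, right], axis=-1)

# Toy uniform frames standing in for a fused stereo pair.
left  = np.full((2, 2), 0.25)
right = np.full((2, 2), 0.75)
img = anaglyph(left, right)    # shape (2, 2, 3)
```

Other listed display types (lenticular, holographic, swept-volume, shuttered/polarized) would instead consume the left/right frames directly or via their own interleaving schemes.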