TriEye’s Infrared Camera Helps Autonomous Cars See Through Haze
by Bill Howard
The Israeli startup TriEye believes it can move cars closer to self-driving with a short-wave infrared, or SWIR, camera as part of the car’s arsenal of sensors. A SWIR camera can see better than traditional optical cameras when there’s rain, fog, dust, or smoke.
TriEye’s breakthrough isn’t the invention of SWIR cameras – they exist today – but a claimed reduction in cost by a factor of 1,000. TriEye says lower-resolution SWIR cameras can cost $20,000 today, which suggests the company believes a camera, or at least its sensor, could be sold for around $20 (our estimate). That’s in line with the sub-$100 cost of radar and fixed (not rotating) lidar modules. TriEye says it will have working cameras, called Raven, ready to sample this year.
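For what it’s worth, here is the arithmetic behind that $20 figure as a rough sketch; reading “1,000x” as a straight division of today’s price is our assumption, not TriEye’s stated math:

```python
# Back-of-the-envelope check of the cost estimate above. Treating "1,000x" as
# a straight division of today's price is our assumption, not TriEye's figure.
current_swir_cost_usd = 20_000    # TriEye's figure for lower-resolution SWIR cameras
claimed_reduction_factor = 1_000  # TriEye's claimed "1,000x" cost reduction

estimated_cost_usd = current_swir_cost_usd / claimed_reduction_factor
print(f"Estimated per-unit cost: ~${estimated_cost_usd:.0f}")  # ~$20
```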
The company was formed in 2017 by researchers at Hebrew University. Short-wave infrared imaging already has some traction in high-value, often defense-related applications using InGaAs, or indium gallium arsenide – an alloy of indium arsenide (InAs) and gallium arsenide (GaAs), typically grown on an indium phosphide (InP) substrate. But it’s expensive. It’s hard to make a mass-market self-driving or assisted car if the tiny camera in your windshield costs as much as a Corolla.
SWIR cameras work in the 1.0-1.9µm (micrometer) wavelength band, whereas traditional optical cameras operate in the visible range of roughly 0.4-0.75µm. TriEye says its big advance, the one making short-wave infrared commercially viable, is adapting SWIR sensing to CMOS – complementary metal-oxide-semiconductor – fabrication, which is far cheaper.
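For reference, here’s a minimal sketch of where those two bands sit, using the band edges quoted above; the helper function and example wavelengths are ours, purely for illustration:

```python
# Wavelength bands cited in the article, in micrometers (um).
VISIBLE = (0.4, 0.75)  # traditional optical cameras
SWIR = (1.0, 1.9)      # short-wave infrared, TriEye's band

def classify(wavelength_um: float) -> str:
    """Classify a wavelength against the two bands named above."""
    if VISIBLE[0] <= wavelength_um <= VISIBLE[1]:
        return "visible (standard camera)"
    if SWIR[0] <= wavelength_um <= SWIR[1]:
        return "short-wave infrared (SWIR)"
    return "outside both bands"

print(classify(0.55))  # green light -> visible
print(classify(1.55))  # a typical SWIR wavelength -> SWIR
```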
The TriEye Raven camera would initially be offered with 1280 x 960 resolution, a field of view of either 17 x 12 degrees or 46 x 34 degrees (a 4:3 aspect ratio), and a frame rate of 30fps. Including the lens, the camera measures 3 x 3 x 2.5 cm, or 1.2 x 1.2 x 1.0 inches.
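Those numbers also imply an angular resolution, which matters for spotting small objects at distance. The quick calculation below is ours, derived from the published resolution and field-of-view figures; it isn’t a TriEye spec and ignores lens distortion:

```python
# Approximate degrees per pixel implied by the Raven's published specs.
RESOLUTION = (1280, 960)            # horizontal x vertical pixels
FOV_OPTIONS = [(17, 12), (46, 34)]  # narrow and wide fields of view, in degrees

for fov_h, fov_v in FOV_OPTIONS:
    deg_per_px_h = fov_h / RESOLUTION[0]
    deg_per_px_v = fov_v / RESOLUTION[1]
    print(f"{fov_h} x {fov_v} deg FOV: ~{deg_per_px_h:.3f} x {deg_per_px_v:.3f} deg/pixel")
```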
TriEye will target both autonomous cars and today’s cars with advanced driver-assistance systems, or ADAS: adaptive cruise control, lane centering assist, and blind-spot detection. On cars coming out in the next few years, a TriEye camera would be better able to make out pavement markings to keep the car centered in its lane – a requirement of Level 2 and higher autonomy – and to pick out people and animals in or near the roadway. TriEye says its sensor also improves low-light imaging.
TriEye says radar and lidar, even working together, have trouble detecting and identifying objects on the road in murky conditions, while its SWIR CMOS sensor can still see them. How much better it is, and how well it works, will have to await real-world testing.
TriEye is not alone in seeking new tools for autonomous driving. To establish a car’s exact location on the roadway, the MIT spinoff WaveSense proposes using ground-penetrating radar to build a precise map of what lies beneath the road. The soil and rock types, cavities, and utility pipes create a unique signature that can locate the car to within a few inches. WaveSense would require initial mapping (and periodic map updates); cars would carry the map data and have their own downward-facing radars. WaveSense tech would still need to work with other sensors to identify other cars, blocked lanes, pedestrians, and animals.
TriEye has raised more than $22 million in funding – not much by Amazon-Apple-Microsoft standards, but enough to develop prototypes to show automakers and suppliers this year. The big win was a large investment in August 2019 by Porsche Investment, valuable both for the money and for the association with the world’s best-known sports car company. Intel Capital provided initial funding; Grove Ventures and Israeli businessman Marius Nacht also were early investors. Porsche’s investment doesn’t mean the camera would be restricted to Porsches.