A projector had far too much fun with car tech

Stop it. You can fool a Tesla Autopilot system with a projector?

Really, the car's system considers a projected image of a human to be a real person?

And are you saying the Mobileye 630 PRO considers a projected road sign to be a real one?

These are the findings of a team of researchers who demonstrated the kinds of phantom attacks that can trick advanced driver-assistance systems.

The team described their experiments and findings in a paper, "Phantom of the ADAS: Phantom Attacks on Driver-Assistance Systems," accompanied by a video demo.

The authors are Ben Nassi, Dudi Nassi, Raz Ben-Netanel, Yisroel Mirsky, Oleg Drokin, and Yuval Elovici. Author affiliations include Ben-Gurion University of the Negev and Georgia Tech. They used the Tesla Model X and the Mobileye 630 PRO systems for testing. They also used a number of projected images; these included a human figure and a street speed sign.

They wanted to know whether an attacker could make a system perceive a projected phantom as a real-world object, confusing the system and gaining a degree of control over it. As the authors put it, "Phantoms can also cause the Tesla Model X (HW 2.5) to brake suddenly."

A video demo showed the car reducing its speed from 18 mph to 14 mph in response to a phantom that it detected as a person.