Communications of the ACM

ACM TechNews

Split-Second 'Phantom' Images Can Fool Tesla's Autopilot


A lot of Teslas.

Credit: Jasper Juinen/Bloomberg/Getty Images

Researchers at Israel's Ben Gurion University of the Negev (BGU) found they could fool Tesla's Autopilot driver-assistance system into reacting automatically, without warning, by flashing split-second images of phantom road signs in video played on an Internet-connected billboard.

BGU's Yisroel Mirsky said, "The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that's dangerous."

The team injected frames of a phantom stop sign into digital billboard video, which tricked a Tesla running the HW3 version of Autopilot, as well as a Mobileye 630 device.

In an email to the researchers, Tesla said its Autopilot feature should not be considered a fully autonomous driving system, but "a driver assistance feature that is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time."

From Wired

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA


 
