
Split-Second 'Phantom' Images Can Fool Tesla's Autopilot

By Wired

October 14, 2020



Researchers at Israel's Ben-Gurion University of the Negev (BGU) found they could fool Tesla's Autopilot driver-assistance system into reacting automatically, without warning, by flashing split-second "phantom" images of road signs in video playing on an Internet-connected digital billboard.

BGU's Yisroel Mirsky said, "The attacker just shines an image of something on the road or injects a few frames into a digital billboard, and the car will apply the brakes or possibly swerve, and that's dangerous."

The team injected frames of a phantom stop sign into video shown on digital billboards, tricking both a Tesla upgraded to the HW3 version of Autopilot and a Mobileye 630 device.
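To make the frame-injection idea concrete, the sketch below splices a phantom stop-sign image into a brief stretch of an advertisement video, the kind of content an attacker might tamper with before it reaches an Internet-connected billboard. This is an illustration only, not the researchers' tooling; the file names, overlay position, frame index, and duration are hypothetical placeholders.

```python
# Illustrative sketch: insert a "phantom" stop sign into a few consecutive
# frames of a billboard ad video. All file names and parameters are assumed.
import cv2

AD_VIDEO   = "billboard_ad.mp4"   # hypothetical input clip
SIGN_IMAGE = "stop_sign.png"      # hypothetical phantom image
OUTPUT     = "tampered_ad.mp4"
INJECT_AT  = 120                  # frame index where the phantom appears
DURATION   = 12                   # consecutive frames to show it
                                  # (~0.4 s at 30 fps; assumed value)

cap = cv2.VideoCapture(AD_VIDEO)
fps = cap.get(cv2.CAP_PROP_FPS)
w   = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
h   = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter(OUTPUT, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))

# Scale the phantom sign to a corner-sized patch of the ad frame.
sign = cv2.resize(cv2.imread(SIGN_IMAGE), (w // 4, h // 4))

idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if INJECT_AT <= idx < INJECT_AT + DURATION:
        # Paste the sign into the upper-left region for a split second.
        frame[0:h // 4, 0:w // 4] = sign
    out.write(frame)
    idx += 1

cap.release()
out.release()
```

The point of the attack is that the phantom only needs to be visible for a fraction of a second, long enough for the vehicle's perception stack to register it but brief enough that human observers are unlikely to notice the tampering.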

In an email to the researchers, Tesla said its Autopilot feature should not be considered a fully autonomous driving system, but "a driver assistance feature that is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time."

From Wired

Abstracts Copyright © 2020 SmithBucklin, Washington, DC, USA
