News
McAfee researchers modify a speed limit sign to easily trick Tesla's autonomous driving system from 35 mph to 85 mph 2020-02-20



Researchers stuck a piece of electrical insulating tape on a speed limit sign and successfully fooled a Tesla electric vehicle into accelerating, revealing potential loopholes in current autonomous driving systems.


According to McAfee's research report, technicians pasted tape across the middle of the "3" on a "35" mph speed limit sign, which led the system to misjudge the limit as "85" mph, after which the vehicle's cruise control accelerated automatically.




McAfee said the problem was not a serious risk to motorists: no one was injured during the experiment, and the researchers driving the car slowed down safely. But the results of the 18-month study show that the machine learning systems used in autonomous driving have weaknesses. According to Steve Povolny, director of McAfee's Advanced Threat Research lab, other studies have also shown how changes in the physical world can easily confuse these systems.


McAfee said such attacks are not common in the real world


The tests involved 2016 Model S and Model X vehicles, which use camera systems from Mobileye (now an Intel business unit).


Although Tesla stopped using the Mobileye system in 2016, several other automakers continue to use it.


Tests of Mobileye's latest camera system did not show the same fault, and Tesla's latest models no longer rely solely on identifying traffic signs, McAfee said. Tesla did not respond to a request for comment on the study.


"Manufacturers and suppliers have been informed of this problem and have further

understood and improved it, " Povolny said. "But this does not change the fact that

there are many blind spots in the autonomous driving industry.


To be sure, the real-world threat is limited. First of all, autonomous driving is still in development, and most systems are tested only when the driver can safely take over the steering wheel at any time. Although many vehicles now support ADAS, drivers still need to pay attention to road conditions.


McAfee researchers could only trick the system by reproducing a specific series of events, including turning on the driving aids and encountering a tampered speed limit sign.


Manufacturers also integrate mapping technology into their systems so that they respond to appropriate speed limits. "Until we have truly autonomous cars, we are unlikely to see this situation in the real world, or attackers trying to exploit these loopholes. By then, we hope such defects will have been fixed," Povolny said.


Even human drivers can be fooled by such changes, and the system the researchers tested was originally designed to assist driving, not to support autonomous driving, Mobileye said.


In addition to sensing, autonomous driving needs to improve safety and fault tolerance through various technologies


"Autonomous driving technology not only relies on sensing, but also receives various

other technologies and data support, such as crowd sourced mapping. It’s for ensure

the reliability of received information from camera sensors and provide stronger backup

and security. "McAfee said.
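
To make the idea of cross-checking concrete, here is a minimal sketch of how a camera-detected speed limit might be reconciled with a crowd-sourced map value before being handed to cruise control. The function name, threshold, and fallback policy are hypothetical illustrations, not any vendor's actual logic.

```python
# Hypothetical sketch: fuse a camera-read speed limit with map data
# and reject implausible readings instead of accelerating on them.
from typing import Optional


def fused_speed_limit(camera_mph: Optional[int],
                      map_mph: Optional[int],
                      max_disagreement_mph: int = 15) -> Optional[int]:
    """Return a speed limit only when the two sources roughly agree.

    If the camera reading (e.g. a tampered sign read as 85 mph) differs
    wildly from the mapped limit (35 mph), prefer the safer, lower value
    rather than trusting the camera alone.
    """
    if camera_mph is None:
        return map_mph
    if map_mph is None:
        return camera_mph
    if abs(camera_mph - map_mph) <= max_disagreement_mph:
        return camera_mph            # sources agree: trust the fresher sensor
    return min(camera_mph, map_mph)  # disagreement: fall back to the lower limit


# Example: a sign misread as 85 mph on a mapped 35 mph road is capped at 35.
assert fused_speed_limit(camera_mph=85, map_mph=35) == 35
```
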


McAfee's research builds on earlier academic work in adversarial machine learning, a relatively new research field that studies how to manipulate computer-based learning systems. In 2017, researchers found that placing four black-and-white stickers at specific locations on a stop sign could trick computer vision into mistaking it for a 45 mph speed limit sign.
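
For readers unfamiliar with the field, below is a minimal digital-domain sketch of an adversarial perturbation using the Fast Gradient Sign Method, assuming a PyTorch image classifier. The physical sticker and tape attacks described above rest on the same principle, but craft the perturbation as printable patches rather than per-pixel noise; this is an illustration, not the researchers' actual method.

```python
import torch
import torch.nn.functional as F


def fgsm_attack(model: torch.nn.Module,
                image: torch.Tensor,
                label: torch.Tensor,
                epsilon: float = 0.03) -> torch.Tensor:
    """Fast Gradient Sign Method: nudge every pixel slightly in the
    direction that increases the classifier's loss, so that a visually
    near-identical image gets mislabeled."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    perturbed = image + epsilon * image.grad.sign()
    return perturbed.clamp(0.0, 1.0).detach()
```
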


Missy Cummings, a robotics professor and autonomous driving expert at Duke University, said the problem is not unique to Tesla or Mobileye but is an inherent safety weakness common to advanced autonomous driving systems. Researchers have shown that changing the physical environment, without accessing the system itself, can potentially lead to severe failure conditions.


"That's why it's so dangerous, because you don't need to crack through the access

system, you just need to exist in our world. "she said.


Cummings said McAfee's findings show why autonomous vehicles should undergo a "vision test" to assess whether an automated driving system can safely detect its surroundings and handle real-world conditions such as other vehicles, pedestrians and other road users. Safety advocates have urged U.S. car safety regulators and lawmakers to include such an assessment among other elements of new autonomous driving rules.

Source: technews
