Fooling a self-driving car with mirrors on traffic cones

Mirrors can fool the Light Detection and Ranging (LIDAR) sensors used to guide autonomous vehicles by making them detect objects that don’t exist, or miss actual obstacles.

A team of scientists from France and Germany demonstrated the techniques in a university campus parking lot. They made a car fitted with LIDAR and running the popular Autoware navigation stack either fail to recognize an obstacle and attempt to drive through it, or brake unexpectedly to avoid an object that wasn’t there.

LIDAR is fitted to most self-driving cars – Tesla is the exception – and uses laser pulses to measure the physical environment, but it is known to struggle with reflective surfaces. Last year a team of eggheads managed to fool LIDAR with tinfoil and colored swatches.
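As a rough, hypothetical illustration of why reflective surfaces are a problem, the short Python sketch below shows how a time-of-flight LIDAR turns a round-trip pulse time into a range, and how a mirror that bounces the pulse onward to something further away makes the sensor report a point beyond the mirror rather than the mirror itself. The geometry and numbers are ours, not the researchers’.

```python
# Toy time-of-flight sketch (illustrative numbers, not from the paper).
C = 299_792_458.0  # speed of light in m/s

def reported_range(round_trip_seconds: float) -> float:
    """LIDAR assumes the pulse went straight out and straight back."""
    return C * round_trip_seconds / 2.0

# A diffuse obstacle 10 m away scatters the pulse straight back:
direct = 2 * 10.0 / C
print(f"diffuse obstacle reported at {reported_range(direct):.1f} m")    # 10.0 m

# A mirror 10 m away bounces the pulse on to a wall 5 m beyond it, and the
# echo retraces the same path. The sensor only sees total flight time, so it
# reports a point 15 m out -- the mirror's own surface effectively vanishes.
via_mirror = 2 * (10.0 + 5.0) / C
print(f"mirrored return reported at {reported_range(via_mirror):.1f} m")  # 15.0 m
```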

Cone hidden in plain sight: a self-driving car fooled by a traffic cone

The European researchers went one better with a technique they called an Object Removal Attack (ORA), which used mirrors of various sizes to cover a traffic cone. By adjusting the size and position of the mirrors they could completely mask the obstacle from the LIDAR system, and they speculated that the mirrors could also obscure the car’s line of sight.
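To get a feel for why a tilted mirror can erase an object, here is a minimal sketch of the specular-reflection geometry – our simplification, not the paper’s model. Once the mirror is angled away from the sensor, the beam and its echo are sent off-axis, so the covered cone never produces returns at its true position.

```python
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Specular reflection: r = d - 2 (d . n) n, for unit vectors."""
    d = direction / np.linalg.norm(direction)
    n = normal / np.linalg.norm(normal)
    return d - 2.0 * np.dot(d, n) * n

beam = np.array([1.0, 0.0])  # LIDAR beam travelling straight ahead

# Mirror facing the sensor head-on: the pulse bounces straight back and the
# surface is still ranged correctly.
print(reflect(beam, np.array([-1.0, 0.0])))        # -> [-1.  0.]

# Mirror tilted 30 degrees: the pulse (and its echo) is deflected 60 degrees
# off the return path, so almost no energy reaches the receiver and the cone
# behind the mirror leaves no usable points in the cloud.
tilt = np.deg2rad(30.0)
print(reflect(beam, np.array([-np.cos(tilt), np.sin(tilt)])))
```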

The second attack, an Object Addition Attack (OAA), used small mirror tiles to make the car detect an obstacle that wasn’t there but that it believed it needed to avoid. The car, under manual control for safety purposes, spotted the fake obstacle 20 meters away and avoided the fictitious problem.
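The flip side of the same physics is that a patch of mirror tiles can return a tight cluster of points where nothing exists. The hypothetical corridor check below – far cruder than Autoware’s actual perception pipeline, with made-up thresholds – shows how few injected points it takes before a planner has something it feels obliged to avoid.

```python
import numpy as np

def nearest_obstacle_in_lane(points: np.ndarray, lane_halfwidth: float = 1.5,
                             max_range: float = 30.0, bin_m: float = 1.0,
                             min_hits: int = 10):
    """Nearest 1 m range bin inside the lane with enough returns, or None."""
    x, y = points[:, 0], points[:, 1]
    keep = (np.abs(y) < lane_halfwidth) & (x >= 0.0) & (x < max_range)
    counts, edges = np.histogram(x[keep], bins=np.arange(0.0, max_range + bin_m, bin_m))
    dense = np.flatnonzero(counts >= min_hits)
    return float(edges[dense[0]]) if dense.size else None

rng = np.random.default_rng(1)

# Clear road after ground removal: a few stray returns, nothing to react to.
clutter = np.column_stack([rng.uniform(0.0, 30.0, 8), rng.uniform(-1.0, 1.0, 8)])
print(nearest_obstacle_in_lane(clutter))                        # -> None

# Mirror tiles inject a dense phantom cluster roughly 20 m ahead, and the
# planner now believes there is something to brake or swerve for.
phantom = np.column_stack([rng.normal(20.5, 0.2, 25), rng.normal(0.0, 0.2, 25)])
print(nearest_obstacle_in_lane(np.vstack([clutter, phantom])))  # -> 20.0
```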

No turning left: a false positive for the self-driving car

“We show that by exploiting the physics of specular reflection, an adversary can inject phantom obstacles or erase real ones using only inexpensive mirrors,” the researchers wrote in a paper submitted to the journal Computers & Security.

“Experiments on a full AV platform, with commercial-grade LIDAR and the Autoware stack, demonstrate that these are practical threats capable of triggering critical safety failures, such as abrupt emergency braking and failure to yield.”

The researchers tested the attacks in various scenarios, using the OAA scheme with different mirror arrangements to deter a car from making a legal turn in traffic. Two mirrors produced a 65 percent success rate in convincing the car’s software that a phantom object was in the way, rising to 74 percent with a grid of six mirrors. Depending on the mirrors’ positioning, the attack could even trigger an emergency stop – not what’s needed in heavy traffic.

Mirrors were even more effective at hiding objects in an ORA attack: the authors found that well-placed reflective surfaces hid obstacles from the car regardless of the mirror angles.

The paper also describes defenses, including thermal imaging, as solid objects usually have a heat signature. The authors warn that adding thermal imaging to LIDAR “is not a panacea,” particularly for small objects in high-temperature environments.
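One way to read that defense is as a cross-check: a LIDAR detection only counts if the matching patch of a thermal image stands out from the background. The sketch below is our own hypothetical illustration of the idea – the field names and the 2-kelvin margin are invented – and it also hints at the paper’s caveat about hot environments.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    range_m: float             # distance reported by the LIDAR pipeline
    thermal_contrast_k: float  # patch temperature minus ambient, in kelvin

def confirmed_by_thermal(det: Detection, min_contrast_k: float = 2.0) -> bool:
    """Mirror-made phantom clusters carry no heat signature of their own."""
    return det.thermal_contrast_k >= min_contrast_k

print(confirmed_by_thermal(Detection(20.0, 6.5)))   # pedestrian-like: True
print(confirmed_by_thermal(Detection(20.0, 0.3)))   # likely phantom: False
# Caveat from the paper: on a hot day a small obstacle may also sit near
# ambient temperature, so this check alone is "not a panacea".
```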

The research team also pointed out that it conducted tests at speeds well below those at which cars travel on freeways, and noted the need for more testing, so we’re a while away from knowing whether $100 of mirrors can threaten your next robo-cab ride. ®


