Keen Security Lab, a security research team focusing on cloud computing, has released a new report on Tesla’s Autopilot system and how it can be tricked with misleading visual inputs.
First, they tricked the automatic wipers, which rely on Autopilot’s cameras and computer vision, by showing images of water to the front-facing camera, causing the system to activate.
They also tricked Tesla’s lane recognition system by placing stickers on the road. As the report describes:
“Based on the research, we proved that by placing interference stickers on the road, the Autopilot system will capture these information and make an abnormal judgement, which causes the vehicle to enter into the reverse lane.”
Tesla responded to the research:
“In this demonstration the researchers adjusted the physical environment around the vehicle to make the car behave differently when Autopilot is in use. This is not a real-world concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should be prepared to do so at all times.”