Some self-driving cars can’t tell the difference between a person in the roadway and an image projected on the street.

What’s new: A team led by researchers at Israel’s Ben-Gurion University of the Negev used projectors to trick semiautonomous vehicles into detecting people, road signs, and lane markings that didn’t exist.

How it works: The researchers projected images of a body (Elon Musk's, to be precise) on a street, a speed-limit sign on a tree, and fake lane markings on a road. A Tesla on Autopilot and a Renault equipped with Intel Mobileye's assistive driving system, both of which rely on cameras and radar rather than three-dimensional lidar, responded by swerving, stopping, or slowing. The paper proposes three convolutional neural networks that together determine whether a detected object is real or illusory (see the sketch after the list below).

  • One CNN checks whether the object's surface texture is realistic, flagging, say, a stop sign projected onto a brick wall.
  • Another checks the object’s brightness to assess whether it reflects ambient, rather than projected, light.
  • The third evaluates whether the object makes sense in context. A stop sign projected on a freeway overpass, for instance, would not.
  • The team validated each model independently, then combined their outputs. The ensemble caught 97.6 percent of phantom objects while mislabeling 2 percent of real objects.
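
A minimal sketch of that committee-of-experts idea, assuming a PyTorch implementation. The `PatchCNN` name, layer sizes, and the average-and-threshold rule for combining the three scores are illustrative assumptions, not the authors' published architecture.

```python
# Sketch: three small CNNs vote on whether a detected object is real or a
# projected phantom. Architecture and fusion rule are assumptions.
import torch
import torch.nn as nn

class PatchCNN(nn.Module):
    """Small binary classifier run on a cropped detection (e.g., a sign)."""
    def __init__(self, in_channels: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # logit: real (high) vs. phantom (low)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

# One expert per cue described in the list above.
surface_cnn = PatchCNN()  # is the surface texture plausible?
light_cnn = PatchCNN()    # ambient reflection or projected light?
context_cnn = PatchCNN()  # does the object make sense in its surroundings?

def is_real(crop: torch.Tensor, context: torch.Tensor) -> bool:
    """Ensemble decision: average the three sigmoid scores and threshold.

    `crop` is the detected object; `context` is a wider crop around it.
    Averaging and the 0.5 cutoff are placeholder choices for illustration.
    """
    scores = torch.stack([
        torch.sigmoid(surface_cnn(crop)),
        torch.sigmoid(light_cnn(crop)),
        torch.sigmoid(context_cnn(context)),
    ])
    return scores.mean().item() > 0.5
```

In practice each expert would be trained separately on labeled crops of real and phantom objects, matching the paper's approach of validating each model independently before combining them.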

Behind the news: A variety of adversarial attacks have flummoxed self-driving cars. A 2018 study fooled them using specially designed stickers and posters. Another team achieved similar results using optical illusions.

Why it matters: A mischief maker with an image projector could turn automotive features designed for safety into weapons of mass collision.

The companies respond: Both companies dismissed the study, telling the authors:

  • “There was no exploit, no vulnerability, no flaw, and nothing of interest: the road sign recognition system saw an image of a street sign, and this is good enough.” — Mobileye
  • “We cannot provide any comment on the sort of behavior you would experience after doing manual modifications to the internal configuration [by enabling an experimental stop sign recognition feature].” — Tesla

We’re thinking: The notion that someone might cause real-world damage with a projector may seem far-fetched, but the possibility is too grave to ignore. Makers of self-driving systems should take it seriously.
