Only Safe Drivers Get Self-Driving
Tesla Opens Beta Test of Full Self-Driving Feature


Tesla’s autonomous driving capability has inspired hair-raising antics on the road. Now the company is deploying an algorithm to determine whether customers have shown sufficiently sound judgment to use its “Full Self-Driving” software.

What’s new: Starting this week, the beta-test version of Tesla’s latest self-driving update will be available only to drivers who have demonstrated safe driving. The beta program was previously open to about 2,000 drivers.

How it works: Drivers can request the software through a button on their car’s dashboard screen.

  • The car then collects data about five factors: forward collision warnings per 1,000 miles, hard braking, aggressive turning, unsafe following, and forced disengagement of self-driving features when the car determines that drivers aren’t paying attention.
  • Customers who maintain a high safety score for a week will be allowed to use the Full Self-Driving beta. The software will enable Tesla vehicles to autonomously brake for traffic lights and decide when to change lanes.
  • Most drivers have a safety score of 80, which they can view in the Tesla app, the company said. It didn’t specify the score necessary to gain access to the beta. (A rough sketch of how such a score might be computed follows this list.)
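
The article lists the factors Tesla tracks but not how they combine into the single number shown in the app. The Python sketch below is one plausible shape for such a score: event counts are normalized per 1,000 miles, weighted, and subtracted from 100. The `DrivingStats` fields, the weights, and the flat penalty for forced disengagements are all assumptions for illustration; Tesla’s actual formula may differ substantially.

```python
from dataclasses import dataclass


@dataclass
class DrivingStats:
    """Telemetry collected over the scoring window (fields are illustrative)."""
    miles: float
    forward_collision_warnings: int
    hard_braking_events: int
    aggressive_turning_events: int
    unsafe_following_events: int
    forced_disengagements: int  # self-driving shut off because the driver wasn't paying attention


# Hypothetical point penalties per event, normalized to 1,000 miles.
# Tesla has not published these numbers in the article; they are assumptions.
WEIGHTS_PER_1K_MILES = {
    "forward_collision_warnings": 1.0,
    "hard_braking_events": 0.5,
    "aggressive_turning_events": 0.5,
    "unsafe_following_events": 0.25,
}
FORCED_DISENGAGEMENT_PENALTY = 5.0  # flat penalty per forced disengagement


def safety_score(stats: DrivingStats) -> float:
    """Return a 0-100 score; higher means safer. Purely illustrative."""
    per_1k = 1000.0 / max(stats.miles, 1.0)  # convert raw counts to a per-1,000-mile rate
    penalty = sum(
        weight * getattr(stats, field) * per_1k
        for field, weight in WEIGHTS_PER_1K_MILES.items()
    )
    penalty += FORCED_DISENGAGEMENT_PENALTY * stats.forced_disengagements
    return max(0.0, 100.0 - penalty)


if __name__ == "__main__":
    # A week of driving with a few minor events and no forced disengagements.
    week = DrivingStats(
        miles=250,
        forward_collision_warnings=2,
        hard_braking_events=3,
        aggressive_turning_events=1,
        unsafe_following_events=4,
        forced_disengagements=0,
    )
    print(f"Safety score: {safety_score(week):.1f}")  # prints 80.0 with these made-up weights
```

Normalizing counts per 1,000 miles, as the article says Tesla does for collision warnings, keeps the score comparable across drivers who log very different weekly mileage.
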

Behind the news: The engineering association SAE International has graded Tesla’s Full Self-Driving system at Level 2 autonomy, which means it must be supervised constantly by a human driver. National Transportation Safety Board (NTSB) chair Jennifer Homendy recently said that Tesla’s use of the term “full self-driving” is irresponsible and called on the company to address basic safety issues before expanding the test program. The National Highway Traffic Safety Administration, which has the authority to demand recalls, is investigating the culpability of Tesla’s software in 11 accidents.

Why it matters: Self-driving technology is still developing and has not yet been proven safe under the vast variety of circumstances that arise in real-world driving. Most companies that are developing such technology hire safety drivers to test their systems within tightly constrained boundaries. In contrast, Tesla is enrolling the best drivers of Tesla vehicles to test its system on the open road.

We’re thinking: Scoring driver behavior and limiting special features to the safest drivers is a good idea, assuming the score is well designed and implemented. It both ensures that only excellent drivers can use the riskiest features and incentivizes all drivers to do their best. But recruiting customers to test unproven technology is reckless. We urge Tesla, and any company that would consider following its lead, to prove its technology’s safety under controlled conditions before putting the general public at risk. And can we stop calling a great driver-assistance system “full self-driving”?
