Tesla Adds Required Safety Test to Use Full Self-Driving

After regulators and Tesla vehicle owners raised a string of concerns about the Autopilot and Full Self-Driving software, Tesla is adding a driving safety test to Full Self-Driving. Drivers will be required to pass a seven-day driver evaluation in order to use the software. The requirement is rolling out with FSD Beta Version 10.1, which the company says also improves on previous releases such as Version 9.2, a build that even Elon Musk admitted was embarrassingly full of bugs.

The U.S. National Highway Traffic Safety Administration (NHTSA) has been especially vocal about the safety of Tesla’s driver-assist programs after several crashes in which Tesla vehicles operating on Autopilot struck emergency vehicles. Members of Congress have also asked the FTC to investigate Tesla’s claims about Autopilot, claims that some of Tesla’s own software engineers, in communications with California’s DMV, have admitted are exaggerated.

Elon Musk did recently acknowledge that he had Tesla prioritize Autopilot development after an accident in which a Tesla vehicle killed a bicyclist. According to court documents in a lawsuit filed against Tesla, the driver admitted to having fallen asleep at the wheel. The case was quickly dismissed, but it highlighted statistics compiled by the NHTSA showing that driver error is a factor in an estimated 94% of vehicle crashes.

Tesla’s Autopilot has since demonstrated that it can safely pull over to the side of the road in real-world driving if it detects that the driver has lost consciousness. The vehicle’s seat and steering-wheel sensors can sometimes be fooled by strategically placed weights, so Tesla recently activated a camera mounted near the rear-view mirror to track driver alertness (and, of course, to detect whether the driver is even in the front seat).
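As a rough illustration of how such signals might be combined, here is a minimal Python sketch; the sensor names, weight threshold, and escalation behavior are assumptions made for illustration, not Tesla’s actual implementation.

```python
# Hypothetical sketch: sensor names, threshold, and logic are assumptions,
# not Tesla's actual driver-monitoring code.

MIN_OCCUPANT_WEIGHT_KG = 25.0  # assumed cutoff for "someone is in the seat"


def driver_is_attentive(seat_weight_kg: float,
                        steering_torque_detected: bool,
                        camera_sees_driver_alert: bool) -> bool:
    """Combine seat, steering-wheel, and cabin-camera signals.

    Seat weight and wheel torque alone can be spoofed with strategically
    placed weights, so the camera serves as an independent check on both
    presence and alertness.
    """
    seat_occupied = seat_weight_kg >= MIN_OCCUPANT_WEIGHT_KG
    return seat_occupied and steering_torque_detected and camera_sees_driver_alert


def next_action(attentive: bool) -> str:
    """Escalate toward a safe stop when the attention check fails."""
    return "continue" if attentive else "warn driver, then pull over safely"


# Example: weights placed on the seat and wheel, but no alert driver on camera.
print(next_action(driver_is_attentive(30.0, True, False)))
# -> warn driver, then pull over safely
```

The point of requiring the camera signal, in this sketch, is that any single sensor is spoofable; an independent second channel makes the check meaningfully harder to defeat.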

Musk did recently admit that developing fully capable self-driving software is more difficult than he initially thought. For this reason, Tesla has repeatedly warned drivers not to become complacent when using its driver-assist programs. Still, Musk seems unwilling to give up on bringing Autopilot, Full Self-Driving, or both up to Level 5 on the SAE’s scale of driving automation, the level at which the software must handle all driving situations in all conditions without requiring any human intervention.

The new driver-evaluation requirement adds a layer of safety (and may reduce Tesla’s exposure to liability) by reminding drivers not to become complacent while using Full Self-Driving. The evaluation tracks variables such as the frequency of hard braking and aggressive turning, which the vehicle’s onboard sensors can measure directly. According to updated information on Tesla’s website, the data is compiled into a “safety score” that estimates the likelihood that the driver’s behavior will contribute to a future collision.
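Tesla has not published how the score is computed, but the description implies a mapping from per-mile event rates to a single number. The Python sketch below is purely illustrative: the metric names, weights, and linear penalty formula are assumptions, not Tesla’s published methodology.

```python
# Illustrative sketch only: Tesla has not published this code, and the
# metric names, weights, and scoring formula below are hypothetical.
from dataclasses import dataclass


@dataclass
class DrivingMetrics:
    """Per-1,000-mile event rates a vehicle's sensors could report."""
    hard_braking_rate: float        # hard braking events per 1,000 miles
    aggressive_turning_rate: float  # high lateral-acceleration turns per 1,000 miles


# Hypothetical penalty weights; the real coefficients are not public here.
WEIGHTS = {
    "hard_braking_rate": 1.5,
    "aggressive_turning_rate": 1.0,
}


def safety_score(metrics: DrivingMetrics) -> float:
    """Map event rates to a 0-100 score, where higher means safer driving."""
    penalty = (
        WEIGHTS["hard_braking_rate"] * metrics.hard_braking_rate
        + WEIGHTS["aggressive_turning_rate"] * metrics.aggressive_turning_rate
    )
    return max(0.0, 100.0 - penalty)


# A driver with few harsh events keeps a high score.
print(safety_score(DrivingMetrics(hard_braking_rate=2.0,
                                  aggressive_turning_rate=4.0)))  # -> 93.0
```

Whatever the real formula looks like, the design choice the article describes is the same: reduce continuous sensor data to one interpretable number that a driver can act on.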

Tesla has not yet said whether drivers can lose access to Full Self-Driving if their safety score falls below a certain level after the initial seven-day evaluation. Although Musk has said his goal is to get Full Self-Driving to the point where it could manage a fleet of autonomous “Robotaxis,” that milestone is still likely a long way off.