The United States’ National Highway Traffic Safety Administration (NHTSA) has requested detailed information on Tesla Autopilot’s ability to recognize a crash scene with emergency vehicles present. The request is part of a probe into accidents in which a Tesla vehicle, with Autopilot engaged, crashed into a stopped emergency vehicle with its lights activated.
The NHTSA is also seeking information on Autopilot’s ability to ensure that drivers remain alert, the details of Tesla’s testing of new Autopilot features, marketing materials related to the driver-assist program, and all consumer complaints, lawsuits, and arbitration cases involving Autopilot. Some incidents indicate that Autopilot can be fooled into thinking the driver is paying attention, including a few cases in which a Tesla owner rode in the back seat of an otherwise driverless vehicle. In response to these incidents, Tesla activated a camera mounted near the rearview mirror to improve tracking of driver alertness.
Tesla currently runs a public beta program in which Tesla owners help test new features and send driving data back to the company to help train the AI behind Autopilot and Full Self-Driving. One new feature is the ability to recognize emergency vehicle lights as well as other vehicles’ turn and brake signals.
The probe has expanded to include twelve incidents, most recently one in which a Tesla vehicle crashed into a Florida Highway Patrol vehicle on an interstate highway near downtown Orlando. Since 2018, these incidents have injured seventeen people and killed one. The probe covers 765,000 Tesla vehicles from the 2014 to 2021 model years.
Tesla has issued repeated warnings that Autopilot and Full Self-Driving (FSD) are not infallible and that drivers should stay alert. These driver-assist programs are rated Level 2 on the Society of Automotive Engineers’ (SAE) six-level scale of vehicle autonomy, which runs from Level 0 (no automation) to Level 5 (full automation). At Level 2, Autopilot and FSD can perform some tasks, such as lane-keeping and cruise control, while the driver remains responsible for the vehicle. Level 5 is full autonomy, requiring no driver input at all.
Tesla’s own engineers have acknowledged in communications with California’s DMV that Tesla and Musk sometimes overstate the capabilities of Autopilot and FSD. Musk recently admitted that developing a Level 5 self-driving car was more difficult than he had originally thought, and Tesla is working on fixes for issues in Full Self-Driving Beta 9.2 for release with Beta 9.3.
CEO Elon Musk has previously clashed with regulators amid scrutiny of both Tesla and SpaceX. Past clashes include the SEC’s censuring of Musk over a 2018 tweet saying that he could take Tesla private, as well as the occasional tug-of-war over Tesla vehicle recalls, one of which involved a flaw in the onboard computer that could cause the touchscreen to go blank. In that case, Tesla denied that the issue posed a safety risk.
Under the Biden Administration, the NHTSA appears to be taking a tougher stance on the safety of driver-assist programs like Autopilot and FSD. The agency had previously hesitated to get involved, citing the risk of hampering the development and adoption of automated driving software. Past studies indicate that driver error is a critical factor in as many as 90% of serious vehicle crashes.
Tesla has until October 22 to respond to the NHTSA’s query; if it does not, it could face a fine of up to $114 million. Tesla, which does not have a PR department, has characteristically not responded to requests for comment on the NHTSA’s most recent probe.