U.S. President Joe Biden appointed Duke University engineering professor Missy Cummings to a senior advisory position at the National Highway Traffic Safety Administration (NHTSA). Cummings studies autonomous systems and has been especially critical of Tesla’s driver-assist programs, Autopilot and Full Self-Driving (FSD).
The appointment sparked considerable backlash among Tesla’s supporters, one of whom called Cummings “the hired gun of the Luddites of mediocrity” in a YouTube video. A petition calling for a review of her appointment had gathered almost 24,000 signatures as of October 23, 2021.
Critics also accused Cummings of a potential conflict of interest, since she serves on the board of directors of Veoneer, a Swedish driver-assist technology company. NHTSA spokesperson Lucia Sanchez told CNN Business only that any possible conflicts of interest would be resolved before Cummings starts her duties at the agency.
Elon Musk called Cummings “extremely biased,” to which she responded only that she would be willing to discuss the issue with him at any time. Amid the backlash on Twitter, Cummings first switched her account to private and then deleted it altogether.
Although she says she supports Tesla for the most part, Cummings said in an interview that Tesla has “problems with Autopilot and Full Self-Driving.”
She said that Autopilot should be used only on the roads for which it is designed. The National Transportation Safety Board is currently investigating a series of crashes involving Tesla vehicles in which Autopilot may have been engaged.
On the flip side, Tesla’s driver-assist programs are not always at fault when a Tesla vehicle crashes. An NTSB investigation of a fatal April wreck involving a Tesla Model S found that the driver was likely in control at the time of the crash. Tesla recommends that owners keep their hands on the wheel (or the steering yoke, on some newer models) and stay alert while using Autopilot or Full Self-Driving.
Musk also admitted that one such crash, in which a Tesla killed a bicyclist, prompted him to have the company prioritize the development of Autopilot. In court testimony, the driver in that case admitted he had fallen asleep at the wheel, claiming the vehicle’s “new car smell” had lulled him to sleep.
Musk can also point to recent data suggesting that Autopilot may help lower the number of crashes per mile, which could appeal to drivers who regularly travel stretches of highway notorious for accidents. Some sections of I-4 in Florida, for instance, see serious accidents and the delays that follow when one or more vehicles crash.
However, Cummings said, names like Autopilot and Full Self-Driving can create the illusion that Tesla’s software is more capable than it really is, lulling drivers into complacency.
“If you’re being told your car has a full self-driving chip and how many of us read the fine print? It’s easy to see how people can develop these incorrect mental models about what their car can do,” she said.
However, this hasn’t necessarily reassured Tesla enthusiasts who oppose any regulations that could keep Tesla from advancing its driver-assist programs to Level 5 on the Society of Automotive Engineers’ (SAE) driving-automation scale, which runs from Level 0 (no automation) to Level 5 (full automation). A Level 5 vehicle would be able to drive itself under all conditions, with no human input required.