Tesla has released an important update to the beta version of Full Self-Driving. Beta Version 10 may fix many of the issues reported in Version 9.2 – a version that even Elon Musk admitted was terrible – and adds new features that make it easier to navigate tricky city-street situations such as roundabouts. It also updates the driving visualizations that let Tesla owners see what Full Self-Driving “sees”.
Elon Musk also expects that Version 10.1 will be forthcoming shortly.
Tesla considers the beta program an important part of improving Full Self-Driving: it collects real-world usage data to help train the AI behind the software. The company has spun up a series of powerful computers to store that data and refine the AI, the most powerful of which will be the “Dojo” supercomputer, designed specifically for machine learning.
Initial reports from beta testers indicate that Beta Version 10 isn’t without the expected bugs and flaws, however. One driver reported that the car approached a parked vehicle faster than he would have liked. Another reported that FSD ran a stop sign. Neither driver intervened, and neither error led to a crash.
On the flip side, drivers say the update allowed FSD to navigate a “twisty” street in San Francisco without intervention and to handle a couple of construction zones without a problem.
Testers have already posted videos of themselves using FSD Beta Version 10 on YouTube. One YouTuber described FSD as “More confident … aggressive” as it drove.
Tesla has repeatedly warned that drivers must stay alert when using Autopilot and FSD. The documents governing access to the FSD beta program require testers to keep their hands on the wheel and warn that they can be removed from the program if the software determines they were inattentive too many times. In practice, it simply amounts to not driving distracted, even with FSD activated.
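Tesla has not published the exact logic behind these removals, but testers describe it as a strike-based system. The sketch below is a hypothetical illustration of that idea only; the threshold, names, and structure are invented for this article and are not Tesla’s actual implementation.

```python
# Hypothetical sketch of a strike-based attentiveness policy, based only on
# testers' public descriptions. The threshold and names are assumptions.
MAX_STRIKES = 3  # assumed value, not Tesla's actual number


class BetaTester:
    def __init__(self, name: str):
        self.name = name
        self.strikes = 0
        self.enrolled = True

    def record_inattentive_event(self) -> None:
        """Called when driver monitoring flags the driver as inattentive."""
        if not self.enrolled:
            return
        self.strikes += 1
        if self.strikes >= MAX_STRIKES:
            self.enrolled = False  # removed from the beta program


tester = BetaTester("example_driver")
for _ in range(MAX_STRIKES):
    tester.record_inattentive_event()
print(tester.enrolled)  # False
```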
Experts say that Full Self-Driving is still at Level 2 on the SAE’s scale of driving automation, which runs from Level 0 to Level 5. That means it can perform some basic driving tasks, but drivers should remain alert for any unusual, complicated, or unexpected driving scenarios.
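For readers keeping track of the terminology, the sketch below is a rough, paraphrased summary of the SAE J3016 levels; the wording and helper function are ours for illustration, not the SAE’s official definitions or Tesla’s.

```python
# Illustrative summary of the SAE J3016 driving-automation levels,
# paraphrased for this article; not official SAE or Tesla text.
from enum import IntEnum


class SAELevel(IntEnum):
    NO_AUTOMATION = 0           # Human driver does everything
    DRIVER_ASSISTANCE = 1       # One assist feature, e.g. adaptive cruise OR lane keeping
    PARTIAL_AUTOMATION = 2      # Combined steering + speed assist; driver must supervise (FSD today)
    CONDITIONAL_AUTOMATION = 3  # Car drives in limited conditions; driver must take over on request
    HIGH_AUTOMATION = 4         # No driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # No driver needed anywhere


def requires_constant_supervision(level: SAELevel) -> bool:
    """At Level 2 or below, the human is still the driver and must stay alert."""
    return level <= SAELevel.PARTIAL_AUTOMATION


print(requires_constant_supervision(SAELevel.PARTIAL_AUTOMATION))  # True
```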
In communications with state-level regulators, Tesla developers expressed concern that the company and Elon Musk might overstate Full Self-Driving’s capabilities. The developers say the software probably won’t reach Level 5 as soon as Musk seemed to expect, and even Musk has admitted that achieving Level 5 – full autonomy – was more difficult than he first expected. Even so, he says Tesla prioritized work on Autopilot after one incident in which a driver fell asleep at the wheel, causing a fatal wreck, and then sued Tesla, alleging that the “new car smell” had made him drowsy.
Tesla handily won that particular lawsuit, and Musk’s concern about the role of driver error in fatal crashes is not unwarranted: despite occasional clashes with Tesla, the NHTSA has itself reported that driver error is a contributing factor in more than 90% of fatal crashes.
The SAE recently issued a revision clarifying the distinctions between some levels on its chart; the revision also removed unclear jargon from the document.