According to federal authorities, a "critical safety gap" in Tesla's Autopilot technology contributed to at least 467 crashes, including at least 13 that were fatal and "many others" involving serious injuries.
The findings come from a National Highway Traffic Safety Administration examination of 956 crashes in which Tesla Autopilot was thought to have been in use. The results of the nearly three-year investigation were released on Friday.
According to the NHTSA investigation, Tesla's Autopilot design has "led to foreseeable misuse and avoidable crashes" and failed to "sufficiently ensure driver attention and appropriate use."
The NHTSA report cited a "weak driver engagement system" paired with an Autopilot that keeps operating even when a driver isn't paying adequate attention to the road or the driving task. Tesla's driver engagement system includes in-cabin cameras that can detect when a driver is not looking at the road, along with a variety of reminders, such as "nags" or chimes, that prompt drivers to stay focused and keep their hands on the wheel.
The agency also said it was opening a new investigation into the effectiveness of a software update that Tesla released in December as part of a recall. That recall was prompted by flaws the NHTSA found in Autopilot during this same investigation.
The voluntary recall, carried out via an over-the-air software update, affected two million Tesla vehicles in the United States and was aimed specifically at improving driver monitoring systems in Teslas equipped with Autopilot.
Because crashes linked to Autopilot are still being reported, the NHTSA said in the assessment released on Friday that the software update was likely insufficient.
In one recent instance, records obtained by CNBC and NBC News show that on April 19, a Tesla driver struck and killed a motorcyclist in Snohomish County, Washington. The driver told officials he had been using Autopilot at the time of the crash.
The NHTSA's findings are the latest in a string of investigations by regulators and watchdogs that have cast doubt on the safety of Tesla's Autopilot system, which the company has positioned as a crucial point of distinction from other automakers.
According to Tesla's website, Autopilot is intended to reduce drivers' "workload" through advanced cruise control and automatic steering.
Tesla has not responded to the NHTSA report released on Friday, nor to requests for comment sent to the company's investor relations team, its vice president of vehicle engineering, Lars Moravy, and its press office.
Sens. Edward J. Markey (D-Mass.) and Richard Blumenthal (D-Conn.) released a statement after the NHTSA study was made public, urging federal authorities to order Tesla to limit its Autopilot function “to the roads it was designed for.”
On its Owner's Manual website, Tesla advises drivers, among a number of other warnings, not to use Autopilot's Autosteer feature "in areas where bicyclists or pedestrians may be present."
The senators stated, "We urge the agency to take all necessary actions to prevent these vehicles from endangering lives."
Earlier this month, Tesla settled a lawsuit brought by the family of Walter Huang, an Apple engineer and father of two who died when his Tesla Model X, with Autopilot features engaged, crashed into a highway barrier. Tesla has sought to keep the settlement's details out of public view.
Even amid these developments, Tesla CEO Elon Musk signaled this week that the company is betting its future on autonomous driving.
“If somebody doesn’t believe Tesla’s going to solve autonomy, I think they should not be an investor in the company,” Musk declared on Tesla’s earnings call Tuesday. He added, “We will, and we are.”
Musk has long assured investors and consumers that Tesla's existing cars could become self-driving vehicles with a software update. Yet the company does not currently make self-driving cars; it primarily offers driver assistance systems.
He has also asserted the safety benefits of Tesla's driver assistance systems without permitting outside analysis of the company's data.
In 2021, for instance, Musk asserted on social media: "Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle."
Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, said he believes Tesla is "autonowashing" with its marketing. Reacting to the NHTSA findings, he also expressed hope that Tesla will proceed with caution in light of the agency's concerns.
"People are dying due to misplaced confidence in Tesla Autopilot capabilities. Even simple steps could improve safety," Koopman said. "Tesla could automatically restrict Autopilot use to intended roads based on map data already in the vehicle. Tesla could improve monitoring so drivers can't routinely become absorbed in their cellphones while Autopilot is in use."