Tesla “did not adequately ensure that drivers maintained their attention.”
Checked Out
Federal regulators released a report this week finding that Tesla's controversial driver assistance software was linked to hundreds of injuries and dozens of deaths.
The National Highway Traffic Safety Administration found that drivers using the EV maker's misleadingly named "Full Self-Driving" software were lulled into a false sense of security and "were not sufficiently engaged in the driving task."
In short, the report found, the Elon Musk-led company’s tech “did not adequately ensure that drivers maintained their attention” — a damning indictment that builds on a wealth of data suggesting Tesla’s experimental software isn’t just flawed but potentially dangerous to use.
Oversold
The report couldn't have come at a worse time. During a disastrous first-quarter earnings call, Musk announced that the company is doubling down on the development of a "robotaxi," allegedly meant to save the automaker from financial ruin.
At this juncture, the regulator's findings shouldn't come as much of a surprise. The NHTSA has already investigated just shy of 1,000 crashes linked to the company's driver assistance software, spanning January 2018 to August 2023 and including 29 deaths.
The agency concluded that Tesla's software didn't do enough to ensure that drivers were paying attention to the road and keeping their hands on the wheel. As a result, it found, many drivers became too complacent and failed to seize control.
The regulator found that in many crashes, drivers had "five or more seconds" to intervene before impact. In 19 of the crashes, drivers had ten or more seconds to act.
Misled
The NHTSA also took aim at the company’s misleading marketing, which has long been a source of contention.
“Notably, the term ‘Autopilot’ does not imply an L2 assistance feature, but rather elicits the idea of drivers not being in control,” the NHTSA wrote in its report, referring to Level 2 Autonomy, or partial driving automation, where the driver must be able to intervene at any time. “This terminology may lead drivers to believe that the automation has greater capabilities than it does and invite drivers to overly trust the automation.”
Meanwhile, Tesla issued an over-the-air software update in December in an apparent attempt to make it more obvious when its driver assistance feature is engaged.
The update drew plenty of criticism from drivers, who complained about many more "visual nags."
“They designed a super convenient system that from a driver’s perspective is fantastic,” Steven Cliff, a former top administrator at the NHTSA, told the Wall Street Journal at the time.
“But from a safety perspective, that’s terrible,” he added.
More on Tesla: Cybertruck Driver Goes Berserk With Road Rage After Mild Teasing