Tesla didn’t fix autopilot after fatal crash, engineers say

Tesla Inc. failed to fix limitations in its Autopilot system following a gruesome Florida crash that killed a driver in 2016, company engineers said in a family’s lawsuit over a similar 2019 fatal collision that’s headed to a jury trial.

The electric-car maker didn’t make any changes to its driver-assistance technology to account for crossing traffic in the nearly three years between two high-profile accidents that killed Tesla drivers whose cars slammed into the side of trucks, according to newly revealed testimony from multiple engineers.

This image provided by the National Transportation Safety Board shows the damage to the left front of the Tesla involved in a May 7, 2016, crash in Williston, Fla. The crash killed Joshua Brown, 40, of Canton, Ohio, who was using the semiautonomous driving system of his Tesla Model S sedan. The sedan struck the underside of a semitrailer that was turning onto a divided highway in Williston. The sedan's roof was sheared off before the vehicle emerged on the other side of the trailer.

After years of touting autonomous driving as the way of the future, Tesla and Chief Executive Officer Elon Musk are under legal pressure from consumers, investors, regulators and federal prosecutors who are questioning whether the company has overhyped its progress toward self-driving vehicles during the last eight years.

Tesla is also the subject of multiple investigations by the National Highway Traffic Safety Administration over possible defects in Autopilot linked to at least 17 deaths since June 2021.

The trial set for October, the first for the company over a death blamed on Autopilot, will pit Musk’s repeated assertion that Teslas are the safest cars ever made against technology experts expected to testify that the company’s marketing has lulled drivers into a false sense of security.

Musk was excused from being questioned in the case by a Florida judge last year. The billionaire chief executive is “hands-on,” “very involved with the product’s definition” and “very involved with making certain decisions around how things should work” with Autopilot, according to excerpts from a 2020 deposition of Tesla’s former director of Autopilot software, Christopher “CJ” Moore, in the family’s revised complaint.

Tesla’s attorneys didn’t immediately respond to requests for comment.

The automaker contends it has been transparent about Autopilot’s limitations, including challenges with detecting traffic crossing in front of its cars. Tesla warns in its owner’s manual and car screens that drivers must be alert and ready to take control of vehicles at any time.