Tesla wins Autopilot crash case in California

Tesla scored a victory Friday after a California jury determined the automaker was not to blame for a 2019 crash that involved its advanced driver assistance system, known as Autopilot.

The jury awarded no damages to Los Angeles resident Justine Hsu, who sued Tesla in 2020 alleging negligence, fraud and breach of contract. This appears to be the first case involving Autopilot to go to trial. Reuters was the first to report the verdict.

Hsu said in her lawsuit that the Tesla Model S she was driving had Autopilot engaged when it turned into and hit a median on a city street. The airbags deployed, fracturing her jaw and causing nerve damage, the lawsuit alleged.

The jury determined that Tesla was not to blame and that the airbag performed as it should. The court filing also said Tesla properly warned users not to engage the system on city streets, which is where Hsu was driving when the crash occurred.

The verdict is a win for Tesla as it faces increased scrutiny from federal and state regulators over Autopilot, as well as over the upgraded versions of the system, Enhanced Autopilot and Full Self-Driving.

Tesla vehicles come standard with a driver-assistance system branded as Autopilot. For a $6,000 upgrade, owners can purchase Enhanced Autopilot, which includes several other features.

For an additional $15,000, owners can buy “full self-driving,” or FSD — a feature that CEO Elon Musk has promised for years will one day deliver full autonomous driving capabilities. Tesla vehicles are not self-driving.

Instead, FSD includes a number of automated driving features that still require the driver to be ready to take control at all times. It includes the parking feature Summon, as well as Navigate on Autopilot, an active guidance system that navigates a car from a highway on-ramp to off-ramp, including handling interchanges and making lane changes. The system is also supposed to handle steering on city streets and recognize and react to traffic lights and stop signs.

In February, Tesla paused the rollout of its Full Self-Driving beta software in the United States and Canada following a recall of the system, which federal safety regulators warned could allow vehicles to act unsafely around intersections and cause crashes.
