‘A very big deal’: Federal safety regulator takes aim at Tesla Autopilot

After four years of laissez-faire treatment under the Trump administration, the nation’s top auto safety regulator is making it clear to Elon Musk and Tesla that there’s a new sheriff in town.

In June, the National Highway Traffic Safety Administration ordered automakers to cough up data on every crash that involves automated driving systems, such as Tesla’s Autopilot. Last month it launched an investigation into a dozen crashes in which Teslas on Autopilot plowed into parked emergency vehicles.

Then, on Tuesday, NHTSA’s Office of Defects Investigation sent an 11-page letter instructing Tesla to provide the agency with an enormous volume of detailed data on each Tesla vehicle sold or leased in the United States from 2014 to 2021. “This could be a very big deal,” said Bryant Walker Smith, a professor at the University of South Carolina, one of the legal field’s foremost experts in automated motor vehicle law.

(Photo caption: A Jan. 22, 2018, still frame from KCBS-TV video shows a Tesla Model S that crashed into a fire engine on Interstate 405 in Culver City, Calif. The National Transportation Safety Board said on Sept. 3, 2019, that the driver was using Autopilot, never saw the parked firetruck and didn’t brake after a vehicle ahead suddenly changed lanes.)

Back in 2016, when automated driving systems first drew broad public attention, the agency published enforcement guidelines making clear that it could enforce safety regulations governing software systems, not just traditional components such as carburetors, air bags or ignition switches.

Subsequently, however, the Trump administration took a lax approach to NHTSA enforcement. As many as 30 investigations into Tesla were launched, delving into Autopilot and other safety concerns, but the vast majority were either closed or remain open without resolution.

The agency’s new activism is bad news for Tesla, whose electric car revenue has been boosted in part by the popularity of its Autopilot driver-assist system, and by the $10,000 it collects from buyers of its Full Self-Driving package (which in fact is not a full self-driving system).

If NHTSA finds Autopilot or Full Self-Driving defective in a way that jeopardizes public safety, the features could be recalled — a prospect that could force changes to the systems and potentially lead to a ban while safety concerns are addressed, legal experts say.

Even a finding that Tesla has promoted what NHTSA calls “predictable abuse” could cause problems for Autopilot. Tesla legal language says human drivers must pay attention at all times with Autopilot engaged, but Tesla marketing, including videos of Musk driving Teslas without using his hands, has seemingly contradicted the warnings. A growing library of YouTube videos shows Tesla drivers misusing the system, some of them crawling into the back seat while the car “drives itself.”