The federal government’s top auto-safety agency is significantly expanding an investigation into Tesla and its Autopilot driver-assistance system to determine if the technology poses a safety risk.
The agency, the National Highway Traffic Safety Administration, said Thursday that it was upgrading its preliminary evaluation of Autopilot to an engineering analysis, a more intensive level of scrutiny that is required before a recall can be ordered.
The analysis will look at whether Autopilot fails to prevent drivers from diverting their attention from the road and engaging in other predictable and risky behavior while using the system.
“We’ve been asking for closer scrutiny of Autopilot for some time,” said Jonathan Adkins, executive director of the Governors Highway Safety Association, which coordinates state efforts to promote safe driving.
NHTSA has said it is aware of 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people. But it said Thursday that it had not determined whether Autopilot has defects that can cause cars to crash while it is engaged.
The wider investigation covers 830,000 vehicles sold in the United States. They include all four Tesla cars — the Models S, X, 3 and Y — in model years from 2014 to 2021. The agency will look at Autopilot and its various component systems that handle steering, braking and other driving tasks, and a more advanced system that Tesla calls Full Self-Driving.
Tesla did not respond to a request for comment on the agency’s move.
The preliminary evaluation focused on 11 crashes in which Tesla cars operating under Autopilot control struck parked emergency vehicles that had their lights flashing. In that review, NHTSA said Thursday, the agency became aware of 191 crashes — not limited to ones involving emergency vehicles — that warranted closer investigation. They occurred while the cars were operating under Autopilot, Full Self-Driving or associated features, the agency said.
Tesla says the Full Self-Driving software can guide a car on city streets but does not make it fully autonomous and requires drivers to remain attentive. It is also available to only a limited set of customers in what Tesla calls a “beta” or test version that is not completely developed.
The deepening of the investigation signals that NHTSA is more seriously considering safety concerns stemming from a lack of safeguards to prevent drivers from using Autopilot in a dangerous manner.
“This isn’t your typical defect case,” said Michael Brooks, acting executive director at the Center for Auto Safety, a nonprofit consumer advocacy group. “They are actively looking for a problem that can be fixed, and they’re looking at driver behavior, and the problem may not be a component in the vehicle.”
Tesla and its chief executive, Elon Musk, have come under criticism for hyping Autopilot and Full Self-Driving in ways that suggest they are capable of piloting cars without input from drivers.
“At a minimum they should be renamed,” said Mr. Adkins of the Governors Highway Safety Association. “Those names confuse people into thinking they can do more than they are actually capable of.”
Competing systems developed by General Motors and Ford Motor use infrared cameras that closely track the driver’s eyes and sound warning chimes if a driver looks away from the road for more than two or three seconds. Tesla did not initially include such a driver monitoring system in its cars, and later added only a standard camera that is much less precise than infrared cameras at tracking a driver’s gaze.
Tesla tells drivers to use Autopilot only on divided highways, but the system can be activated on any streets that have lines down the middle. The G.M. and Ford systems — known as Super Cruise and BlueCruise — can be activated only on highways.
Autopilot was first offered in Tesla models in late 2015. It uses cameras and other sensors to steer, accelerate and brake with little input from drivers. Owner’s manuals tell drivers to keep their hands on the steering wheel and their eyes on the road, but early versions of the system allowed drivers to keep their hands off the wheel for five minutes or more under certain conditions.
Unlike technologists at almost every other company working on self-driving vehicles, Mr. Musk insisted that autonomy could be achieved solely with cameras tracking a car’s surroundings. But many Tesla engineers questioned whether relying on cameras without other sensing devices was safe enough.
Mr. Musk has regularly promoted Autopilot’s abilities, saying autonomous driving is a “solved problem” and predicting that drivers will soon be able to sleep while their cars drive them to work.
Questions about the system arose in 2016 when an Ohio man was killed when his Model S crashed into a tractor-trailer on a highway in Florida while Autopilot was activated. NHTSA investigated that crash and in 2017 said it had found no safety defect in Autopilot.
But the agency issued a bulletin in 2016 saying driver-assistance systems that fail to keep drivers engaged “may also be an unreasonable risk to safety.” And in a separate investigation, the National Transportation Safety Board concluded that the Autopilot system had “played a major role” in the Florida crash because while it performed as intended, it lacked safeguards to prevent misuse.
Tesla is facing lawsuits from families of victims of fatal crashes, and some customers have sued the company over its claims for Autopilot and Full Self-Driving.
Last year, Mr. Musk acknowledged that developing autonomous vehicles was more difficult than he had thought.
NHTSA opened its preliminary evaluation of Autopilot in August and initially focused on 11 crashes in which Teslas operating with Autopilot engaged ran into police cars, fire trucks and other emergency vehicles that had stopped and had their lights flashing. Those crashes resulted in one death and 17 injuries.
While examining those crashes, it discovered six more involving emergency vehicles and eliminated one of the original 11 from further study.
At the same time, the agency learned of dozens more crashes that occurred while Autopilot was active and that did not involve emergency vehicles. Of those, the agency first focused on 191, and eliminated 85 from further scrutiny because it could not obtain enough information to determine clearly whether Autopilot was a major cause.
In about half of the remaining 106, NHTSA found evidence that suggested drivers did not have their full attention on the road. About a quarter of the 106 occurred on roads where Autopilot is not supposed to be used.
In an engineering analysis, NHTSA’s Office of Defects Investigation sometimes acquires vehicles it is examining and arranges testing to try to identify flaws and replicate problems they can cause. In the past it has taken apart components to find faults, and has asked manufacturers for detailed data on how components operate, often including proprietary information.
The process can take months or even longer, though NHTSA aims to complete this analysis within a year. If it concludes that a safety defect exists, it can press a manufacturer to initiate a recall and correct the problem.
On rare occasions, automakers have contested the agency’s conclusions in court and prevailed in halting recalls.