Shortly before 2 p.m. on a clear July day in 2020, as Tracy Forth was driving near Tampa, Fla., her white Tesla Model S was hit from behind by another car in the left lane of Interstate 275.
It was the kind of accident that occurs thousands of times a day on American highways. When the vehicles collided, Ms. Forth’s car slid into the median as the other one, a blue Acura sport utility vehicle, spun across the highway and onto the far shoulder.
After the collision, Ms. Forth told police officers that Autopilot — a Tesla driver-assistance system that can steer, brake and accelerate cars — had suddenly activated her brakes for no apparent reason. She was unable to regain control, according to the police report, before the Acura crashed into the back of her car.
But her description is not the only record of the accident. Tesla logged nearly every particular, down to the angle of the steering wheel in the milliseconds before impact. Captured by cameras and other sensors installed on the car, this data provides a startlingly detailed account of what occurred, including video from the front and the rear of Ms. Forth’s car.
It shows that 10 seconds before the accident, Autopilot was in control as the Tesla traveled down the highway at 77 miles per hour. Then she prompted Autopilot to change lanes, and as the car moved into the right lane, the system abruptly braked near a truck parked on the side of the road.
The data collected by Ms. Forth’s Model S was no fluke. Tesla and other automakers increasingly capture such information to operate and improve their driving technologies.
The automakers rarely share this data with the public. That has clouded the understanding of the risks and rewards of driver-assistance systems, which have been involved in hundreds of crashes over the past year.
But experts say this data could fundamentally change the way regulators, police departments, insurance companies and other organizations investigate anything that happens on the road, making such investigations more accurate and less costly.
It could also improve the way cars are regulated, giving government officials a clearer idea of what should and should not be allowed. Fatalities on the country’s highways and streets have been climbing in recent years, reaching a 20-year high in the first three months of this year, and regulators are trying to find ways to reverse the trend.
“This can help separate crashes related to technology from crashes related to driver error,” said Bryan Reimer, a research scientist at the Massachusetts Institute of Technology who specializes in driver-assistance systems and automated vehicles.
This data is significantly more extensive and specific than the information collected by event data recorders, also known as “black boxes,” which have long been installed on automobiles. Those devices collect data in the few seconds before, during and after a crash.
Tesla’s data, by contrast, is a constant stream of information that includes video of the car’s surroundings and statistics — sometimes called vehicle performance data or telematics — that further describes its behavior from millisecond to millisecond.
This provides a comprehensive look at the vehicle collecting the data as well as insight into the behavior of other cars and objects on the road.
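In rough terms, the difference looks like this. Below is a minimal sketch in Python; the frame fields, the sampling rate and the five-second window are assumptions chosen for illustration, not Tesla's actual data format:

```python
from dataclasses import dataclass

# Hypothetical telemetry frame. The field names are illustrative
# assumptions, not Tesla's actual schema.
@dataclass
class TelemetryFrame:
    timestamp_ms: int        # milliseconds since the start of the drive
    speed_mph: float
    steering_angle_deg: float
    autopilot_engaged: bool
    brake_applied: bool

# A continuous stream logs a frame every few milliseconds for the
# entire drive ...
stream = [
    TelemetryFrame(t, 77.0, 0.0, True, False)
    for t in range(0, 60_000, 10)  # one frame every 10 ms for a minute
]

# ... whereas a traditional event data recorder ("black box")
# preserves only a few seconds around a triggering event.
def edr_snapshot(frames, crash_ms, window_ms=5_000):
    """Keep only the frames within a few seconds of a crash, EDR-style."""
    return [f for f in frames if abs(f.timestamp_ms - crash_ms) <= window_ms]

print(len(stream))                        # 6000 frames for the full minute
print(len(edr_snapshot(stream, 55_000)))  # ~1000 frames near the event
```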
Video alone provides insight into crashes that was rarely available in the past. In April, a motorcyclist was killed after colliding with a Tesla in Jacksonville, Fla. Initially, the Tesla’s owner, Chuck Cook, told the police that he had no idea what had happened. The motorcycle struck the rear of his car, out of his field of vision. But video captured by his Tesla showed that the crash occurred because the motorcycle had lost a wheel. The culprit was a loose lug nut.
When detailed statistics are paired with such video, the effect can be even more powerful.
Matthew Wansley, a professor at the Cardozo School of Law in New York who specializes in emerging automotive technologies, saw this power during a stint at a self-driving car company in the late 2010s. Data gathered from cameras and other sensors, he said, provided extraordinary insight into the causes of crashes and other traffic incidents.
“We not only knew what our vehicle was doing at any given moment, right down to fractions of a second, we knew what other vehicles, pedestrians and cyclists were doing,” he said. “Forget eyewitness testimony.”
In a new academic paper, he argues that all carmakers should be required to collect this kind of data and openly share it with regulators whenever a crash — any crash — occurs. With this data in hand, he believes, the National Highway Traffic Safety Administration can improve road safety in ways that were previously impossible.
The agency, the country’s top auto safety regulator, is already collecting small amounts of this data from Tesla as it investigates a series of crashes involving Autopilot. Such data “strengthens our investigation findings and can often be helpful in understanding crashes,” the agency said in a statement.
Others say this data can have an even larger effect. Ms. Forth’s lawyer, Mike Nelson, is building a business around it.
Backed by data from her Tesla, Ms. Forth ultimately decided to sue the driver and the owner of the car that hit her, claiming that the car tried to pass hers at an unsafe speed. (A lawyer representing the other car’s owner declined to comment.) But Mr. Nelson says such data has more important uses.
His recently founded start-up, QuantivRisk, aims to collect driving data from Tesla and other carmakers, analyze it and sell the results to police departments, insurance companies, law offices and research labs. “We expect to be selling to everybody,” said Mr. Nelson, a Tesla driver himself. “This is a way of gaining a better understanding of the technology and improving safety.”
Mr. Nelson has obtained data related to about 100 crashes involving Tesla vehicles, but expanding to much larger numbers could be difficult. Because of Tesla’s policies, he can gather the data only with the approval of each individual car owner.
Tesla’s chief executive, Elon Musk, and a Tesla lawyer did not respond to requests for comment for this article. But Mr. Nelson says he thinks Tesla and other carmakers will ultimately agree to share such data more widely. It may expose when their cars malfunction, he says, but it will also show when the cars behave as advertised — and when drivers or other vehicles are at fault.
“The data associated with driving should be more open to those that need to understand how accidents happen,” Mr. Nelson said.
Mr. Wansley and other experts say that openly sharing data in this way could require a new legal framework. At the moment, it is not always clear whom the data belongs to — the carmaker or the car owner. And if the carmakers start sharing the data without the approval of car owners, this could raise privacy concerns.
“For safety-related data, the case for openly sharing this data is pretty strong,” Mr. Wansley said. “But there will be a privacy cost.”
Mr. Reimer, of M.I.T., also cautions that this data is not infallible. Though it is highly detailed, it can be incomplete or open to interpretation.
With the crash in Tampa, for instance, Tesla provided Mr. Nelson with data covering only a short window of time. And it is unclear exactly why Autopilot suddenly hit the brakes, though the truck parked on the side of the road seems the most likely cause.
But Mr. Reimer and others also say the video and other digital data collected by companies like Tesla could be a great asset.
“When you have objective data,” he said, “opinions don’t matter.”