Tesla’s Favorite Autopilot Safety Statistic Doesn’t Hold Up

For more than a year, Tesla has defended its semiautonomous Autopilot as a vital, life-saving feature. CEO Elon Musk has lambasted journalists who write about crashes involving the system. “It’s really incredibly irresponsible of any journalists with integrity to write an article that would lead people to believe that autonomy is less safe,” he said during a tumultuous earnings call this week. “Because people might actually turn it off, and then die.”

This wasn’t the first time Musk has made this argument about Autopilot, which keeps the car in its lane and a safe distance from other vehicles but requires constant human oversight, and has been involved in two fatal crashes in the US. “Writing an article that’s negative, you’re effectively dissuading people from using autonomous vehicles, you’re killing people,” he said on an October 2016 conference call.

Wednesday’s haranguing, however, came a few hours after the National Highway Traffic Safety Administration (NHTSA) indicated that Tesla has been misconstruing the key statistic it uses to defend its technology. Over the past year and a half, Tesla spokespeople have repeatedly said that the agency has found Autopilot to reduce crash rates by 40 percent. They repeated it most recently after the death of a Northern California man whose Model X crashed into a highway safety barrier while in Autopilot mode in March.

Now NHTSA says that’s not exactly right—and there’s no clear evidence for how safe the pseudo-self-driving feature actually is.

The remarkable stat comes from a January 2017 report that summarized NHTSA’s investigation into the death of Joshua Brown, whose Model S crashed into a truck turning across its path while in Autopilot mode. According to the report’s data, model year 2014 through 2016 Teslas saw 1.3 airbag deployments per million miles before Tesla made Autopilot available via an over-the-air software update. Afterward, the rate was 0.8 per million miles. “The data show that the Tesla vehicles’ crash rate dropped by almost 40 percent after Autosteer installation,” the investigators concluded.
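For what it’s worth, the arithmetic behind that conclusion is easy to check. Here’s a minimal sketch in Python, using the two rates quoted in the report:

```python
# Airbag-deployment rates per million miles, as quoted in NHTSA's 2017 report
rate_before_autosteer = 1.3
rate_after_autosteer = 0.8

# Relative reduction: (before - after) / before
reduction = (rate_before_autosteer - rate_after_autosteer) / rate_before_autosteer
print(f"Reduction: {reduction:.1%}")  # prints 38.5%, i.e. "almost 40 percent"
```

The math itself checks out. The trouble, as it turns out, is the data feeding it.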

Just a few problems. First, as reported by Reuters and confirmed to WIRED, NHTSA has reiterated that its data came from Tesla and has not been verified by an independent party (as it noted in a footnote in the report). Second, it says its investigators did not consider whether the driver was using Autopilot at the time of each crash. (Reminder: Drivers are only supposed to use Autopilot in very specific contexts.) And third, airbag deployments are an inexact proxy for crashes, especially considering that in the death that triggered the investigation, the airbags did not deploy.

Tesla declined to comment on NHTSA’s clarification.

The statistic has been the subject of controversy for some time. The research firm Quality Control Systems Corp. has filed a Freedom of Information Act lawsuit against NHTSA for the underlying data in that 2017 report, which it hopes to use to determine whether the 40 percent figure is valid. NHTSA has thus far denied its FOIA requests, saying it agreed to Tesla’s requests to keep the data confidential, and that its release could threaten the carmaker’s competitiveness.

Tesla’s oft-touted figure is flawed for another reason, experts say: With this data set, you can’t separate the role of Autopilot from that of automatic emergency braking, which Tesla began releasing just a few months before Autopilot. According to the Insurance Institute for Highway Safety, vehicles that can detect imminent collisions and hit the brakes on their own suffer half as many rear-end crashes as those that can’t. (More than 99 percent of the cars Tesla produced in 2017 included the feature as standard equipment, a higher proportion than any other carmaker’s.)
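To see why that entanglement matters, consider a purely illustrative sketch, with invented numbers, of how a confounder like automatic emergency braking can swallow a before-and-after comparison:

```python
# Hypothetical illustration only: apart from the 1.3 baseline from NHTSA's
# report, the numbers below are invented, not drawn from NHTSA or IIHS data.
baseline_rate = 1.3    # crashes per million miles, before AEB and Autopilot
aeb_reduction = 0.5    # IIHS: AEB cuts rear-end crashes roughly in half
rear_end_share = 0.6   # invented: share of these crashes that are rear-end

# If AEB alone halved rear-end crashes, the post-update rate would already be:
rate_after_aeb_only = baseline_rate * (1 - rear_end_share * aeb_reduction)
print(f"Rate explainable by AEB alone: {rate_after_aeb_only:.2f}")  # 0.91

# A naive before/after comparison credits the entire drop (1.3 -> 0.8) to
# Autopilot, even though, in this scenario, most of it belongs to AEB.
```

Under those made-up numbers, automatic emergency braking alone would account for most of the observed decline, no Autopilot required. The real shares are unknown, which is exactly the point.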

Which is all to say, determining whether a new feature like Autopilot is safe, especially if you don’t have access to lots of replicable, third-party data, is super, super hard. Tesla’s beloved 40 percent figure comes with so many caveats that it’s unreliable.

Big Deal Data

The Insurance Institute for Highway Safety has tried to come at the question another way, by looking at the frequency of insurance claims. When it isolated Model S sedan incidents from after Autopilot’s release, it observed no change in the frequency of property damage and bodily injury liability claims. That indicates that Autopilot drivers aren’t more or less likely to damage their cars or get hurt than others. But it did find a 13 percent reduction in collision claim frequency, indicating sedans equipped with Autopilot got into fewer crashes that resulted in collision claims to insurers.

Oh, but it gets more complicated. IIHS couldn’t tell which crashes actually involved the use of Autopilot, as opposed to merely involving sedans equipped with it. And it’s way too early for definitive answers. “Since other safety technologies are layered below Autopilot, it is difficult to tease out results for Autopilot alone at this time,” says Russ Rader, an IIHS spokesperson. “Data on insurance claims for the Model S are still thin.”

Over at MIT, researchers frustrated with the dearth of good info on Autopilot and other semiautonomous car features have launched their own lines of inquiry. Human guinea pigs are now driving sensor- and camera-laden Teslas, Volvos, and Range Rovers around the Boston area. The researchers will use the data they generate to understand how safely humans operate those vehicles.

The upshot is that Autopilot might, in fact, be saving a ton of lives. Or maybe not. We just don’t know. And Tesla hasn’t been transparent with its own numbers. “You would need a rigorous statistical analysis with clear data indicating what vehicle has it and what vehicle doesn’t and whether it’s enabled or whether it isn’t,” says David Friedman, a former NHTSA official who now directs car policy at Consumers Union. Tesla said this week that it would begin publishing quarterly Autopilot safety statistics, but did not indicate whether its data would be verified by a third party.

NHTSA, too, could be doing a better job of holding innovative but opaque carmakers like Tesla accountable for proving the safety of their new tech. “To me, they should be more transparent by asking Tesla for disengagements of the system: How often the systems disengaged, how often the humans need to take over,” Friedman says. California’s Department of Motor Vehicles requires companies testing autonomous vehicles in the state to provide annual data on disengagements, to help officials understand the limitations of the tech and its progress.

Tesla is not alone among carmakers in trying to shield sensitive info from the public. But today, humans are deeply bewildered about the semiautonomous features that have already made their way into everyday drivers’ garages. Even Transportation Secretary Elaine Chao—you know, the public official charged with overseeing regulation on these things—is confused by the terminology. If there’s a time to get honest about your numbers, this could be it. You might save some lives. You’d definitely save a few headaches.

Tesla Turmoil

Elon Musk’s attack on investors could make him a liability
Buying a Tesla? Don’t count on that $7,500 tax credit
Tesla’s wild fight with the feds investigating its latest Autopilot death
