Tesla is taking PR very seriously after the driver of one of its vehicles was recently killed while Autopilot was engaged.
The crash occurred at 9:27 AM on Highway 101 near Mountain View, California. Walter Huang was in the driver’s seat of the Model X, which had Autopilot engaged. The car hit a concrete highway divider, marked with black and yellow chevrons, at full force. Huang didn’t take any action. The SUV crumpled like a tin can, and Huang didn’t make it.
The investigation into the fatal #Tesla crash continued today with the #CHP & #NTSB digging through the scorched EV https://t.co/rfdgY88bn7 pic.twitter.com/vd2YzFmAZ0
— Dean C. Smith (@DeanCSmith) March 29, 2018
Other information has been hard to come by, due to the severity of the damage. So far we don’t know whether his death was the result of negligence, a fatal nap, or simple distraction amid the fireworks of warning lights and sounds. But one thing is clear: the crash shows that audio and visual cues on the dashboard can be insufficient to prevent a collision.
Huang wasn’t the first to die in a Tesla with Autopilot active. In 2016, Joshua Brown crashed his Model S into a truck, marking the first fatal collision while Autopilot was engaged.
The timing for this particular crash isn’t exactly ideal (from Tesla’s perspective). Uber is already doing damage control after its self-driving car killed a pedestrian in Arizona on March 19, four days before Huang’s fatal collision.
Interestingly, officials aren’t too pleased with Tesla’s PR offensive. On Sunday, a spokesperson for the U.S. National Transportation Safety Board (NTSB) told the Washington Post:
At this time the NTSB needs the assistance of Tesla to decode the data the vehicle recorded. In each of our investigations involving a Tesla vehicle, Tesla has been extremely cooperative on assisting with the vehicle data. However, the NTSB is unhappy with the release of investigative information by Tesla.
Presumably, investigators are unhappy because they’d rather gather as much information as they can before releasing a report.
But Tesla might have jumped the gun. Failing to comply with the NTSB’s investigation processes and deadlines could bring its technological advancements (and safety improvements) to a screeching halt.
After the Uber car’s crash, the company was banned from further testing in Arizona (though other companies were allowed to continue). Many people feared the crash would fray the public’s trust in autonomous vehicles, but so far that largely hasn’t come to pass.
But if the crashes continue, that could change. The market for autonomous cars could dry up before the technology becomes reliable enough to make them widespread.
Tesla’s Autopilot is a Level 2 driver-assistance system, while Uber’s self-driving car operates at Level 4; on the SAE scale, Level 2 still requires an attentive human driver ready to take over, while Level 4 can drive itself within defined conditions. So the two technologies aren’t really the same. Still, a turn in the tide of public opinion could sweep both up with it.
Autonomous vehicles aren’t the best at sharing the unpredictable road with imprecise humans. Yes, once fully autonomous vehicles roll out all over the country and make up 100 percent of the vehicles on the road, American roads will inevitably become safer.
But we’re not there yet. If crashes like these keep happening, and the public loses trust, we might never be.