In Brief
Two recently reported crashes involving cars with autonomous driving systems have revived a key question: who is at fault, the driver or the car?
Collision Ethics
Two separate incidents in California involving self-driving vehicles have recently drawn attention. One accident involved a Tesla Model S; the other, a Chevrolet Bolt that was using General Motors’ Cruise Automation technology. In both cases, the vehicles were reportedly using their respective autonomous driving systems.
Culver City fire officials reported on January 22 that the Model S “plowed into the rear” of one of their fire trucks on a freeway. According to the firefighters’ tweet, the Tesla was traveling at 105 km per hour (65 mph) in Autopilot mode when it hit the truck.
While working a freeway accident this morning, Engine 42 was struck by a #Tesla traveling at 65 mph. The driver reports the vehicle was on autopilot. Amazingly there were no injuries! Please stay alert while driving! #abc7eyewitness #ktla #CulverCity #distracteddriving pic.twitter.com/RgEmd43tNe
— Culver City Firefighters (@CC_Firefighters) January 22, 2018
More than a month earlier, in December 2017, a Chevy Bolt that was driving autonomously collided with a motorcycle as the car was changing lanes. According to the incident report GM filed with California’s Department of Motor Vehicles, the Bolt “glanced the side” of the motorcycle. The injured motorcyclist filed a lawsuit against the American carmaker, reportedly the first lawsuit involving a self-driving car. The incident was only recently made public.
Though these are hardly the first crashes involving self-driving cars, the incidents raise a question that’s been asked many times before: who is the responsible party, the driver or the automaker that designed the autonomous driving technology?
Hitting the Brakes
When it comes to typical vehicular accidents, determining which party is at fault is already challenging. That challenge only grows when vehicles running on autonomous systems are introduced. To be clear, however, many of the vehicles dubbed “self-driving” have not achieved full autonomy. Most still rely on input from a driver behind the wheel.
Alain L. Kornhauser, director of the Transportation Program at Princeton University and chair of the Princeton Autonomous Vehicle Engineering (PAVE) research group, thinks that in the case of these two recent crashes, the drivers share part of the blame. He told Futurism, however, that “Automated Emergency Braking [AEB] should be redesigned to work.” If a car equipped with AEB senses an impending collision and the driver does not react in time, the car will start braking on its own. According to Consumer Reports, Tesla, Subaru, and Infiniti owners are the most satisfied with their vehicles’ AEB systems.
Kornhauser explained that the National Highway Traffic Safety Administration’s (NHTSA) attitude toward emergency systems in self-driving cars is faulty, favoring what he calls “crash mitigation.” Often, the AEB system doesn’t kick in until a driver actually touches the brakes. “The ‘NHTSA mentality’ should be to avoid crashes, not just mitigate them,” Kornhauser said. “What needs to be done is that this kind of design mentality must change.”
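To make the distinction concrete, here is a minimal, purely illustrative sketch of how a time-to-collision check separates the two design mentalities. It is not Tesla’s, GM’s, or any real system’s logic; the thresholds and function names are hypothetical. A “mitigation” design intervenes late, in some cases only once the driver has touched the brakes, while a crash-avoidance design brakes early enough to stop on its own.

```python
# Illustrative sketch only: thresholds and interfaces are hypothetical,
# not drawn from any real AEB implementation.

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:  # not closing on the obstacle
        return float("inf")
    return gap_m / closing_speed_mps

def aeb_command(gap_m, closing_speed_mps, driver_braking, mitigation_only):
    ttc = time_to_collision(gap_m, closing_speed_mps)
    if mitigation_only:
        # "Crash mitigation" mentality: intervene at the last moment,
        # and only once the driver has already touched the brakes.
        if driver_braking and ttc < 0.8:  # hypothetical late threshold
            return "FULL_BRAKE"
        return "NONE"
    # Crash-avoidance mentality: brake early enough to stop entirely,
    # whether or not the driver has reacted.
    if ttc < 2.5:  # hypothetical early threshold
        return "FULL_BRAKE"
    if ttc < 4.0:
        return "WARN_AND_PRECHARGE"  # alert the driver, pre-fill brakes
    return "NONE"

# Example: closing on a stopped fire truck 40 m ahead at 29 m/s (~65 mph).
print(aeb_command(40, 29, driver_braking=False, mitigation_only=True))   # NONE
print(aeb_command(40, 29, driver_braking=False, mitigation_only=False))  # FULL_BRAKE
```

Run against a stopped obstacle 40 meters ahead at freeway speed, the mitigation branch does nothing while the avoidance branch commands a full stop; that gap between the two outcomes is the design-mentality change Kornhauser is calling for.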
He added: “This is why we need to perfect AEB, or what I call ‘safe-driving cars,’ before we go all out with letting people take their hands and feet off the controls even for a little while [or] at certain times.” Most states still require a driver to be behind the wheel during autonomous vehicle testing.
Image credit: General Motors.
Kornhauser explained that for truly driverless cars to successfully hit the streets, the safe-driving aspects essentially have to be perfect. “These systems don’t have a fall guy” in the case of an accident, he said. So producers and fleet owners won’t sell or deploy fully autonomous vehicles until these designs are even further refined.
Neither Tesla nor GM is a stranger to crashes involving vehicles with self-driving systems. The state of California, which seems to have taken a friendlier stance on testing autonomous cars, has seen over 30 accidents involving self-driving vehicles since 2014.
Many believe that these crashes should in no way impede further research into self-driving cars. Rather, such crashes should continue to inform how these vehicles are designed and developed, with the ultimate goal of saving more lives in the long run.