In October 2021, Missy Cummings left her engineering professorship at Duke University to join the National Highway Traffic Safety Administration (NHTSA) in a temporary position as a senior safety advisor. It wasn’t long before Elon Musk tweeted an attack: “Objectively, her track record is extremely biased against Tesla.” He was referring to Cummings’s criticism of his company’s Autopilot, which is supposed to help the driver drive, though some customers have used it to make the car drive itself—sometimes with disastrous results.
Some of Musk’s fans followed his lead: Cummings received a slew of online attacks, some of them threatening.
As a former Navy fighter pilot, Cummings was used to living dangerously. But she hates taking unnecessary risks, particularly on the road. At NHTSA, she scrutinized data on cars operating under varying levels of automation, and she pushed for safer standards around autonomy. Now out of the government and in a new academic perch at George Mason University, she answered five high-speed questions from IEEE Spectrum.
We are told that today’s cars, with their advanced driver-assistance systems (ADAS), are fundamentally safer than ever before. True?
Cummings: There is no evidence of mitigation. At NHTSA we couldn’t answer the question of whether you’re less likely to get into a crash, because we had no data. But if you are in a crash, you’re more likely to be injured, because people in ADAS-equipped cars are more likely to be speeding.
Could it be that people are trading the extra safety these systems might otherwise have provided for other things, like getting home 3 minutes sooner?
Cummings: I call it risk homeostasis. It’s a big problem with Tesla, for example. You’re told it has self-driving capability, with all these features, such as automatic braking. You think, oh, the car is going to do x, y, and z for me, and then it turns out that it doesn’t.
Did you observe risk homeostasis back in your fighter-pilot days?
Cummings: It happened with air-to-ground bombing radar. Pilots figured out that you could use it to set up a self-contained approach to an aircraft carrier and then manage the landing by yourself. Given what control freaks pilots are, it happened. But the system didn’t adjust for the pitching deck, so it set people up for much more lethal approaches.
Some have said that partial autonomy is the riskiest solution of all. What’s your take?
Cummings: The policy should be that either the computer is driving or you are driving. And by driving I mean steering—people do fine with regular cruise control. The act of keeping your hands on the wheel and guiding the car’s lateral motion is enough to keep your brain engaged. So, no L3 [conditional automation, in which the car drives itself but the driver must be ready to take the wheel], which is too confusing, and no hands-free L2 [partial automation]. I am not against the passing of control per se, but there should be just two modes of operation, with crystal-clear feedback about which mode you are in.
When do you think true self-driving cars will come?
Cummings: It’s possible to do self-driving in narrow applications. Waymo has been giving rides for a long time in Chandler [Ariz.]. That environment is very structured, and it’s much easier to operate these systems there. My favorite application is last-mile delivery, such as food delivery; it could be very helpful when, say, viruses spike. But the day when AI in cars can handle all conditions on the road, all of the time—it’s not going to be in my lifetime.
Mary (Missy) L. Cummings is the director of the Autonomy and Robotics Center at George Mason University and a senior member of IEEE. She received a Ph.D. in systems engineering from the University of Virginia.