Engineers inside Tesla wanted to add robust driver monitoring systems to the company’s cars to help make sure drivers use Autopilot safely, and Tesla even worked with suppliers on possible solutions, according to The Wall Street Journal. But Tesla executives, Elon Musk among them, reportedly rejected the idea out of concern that the options might not work well enough, could be expensive, and might annoy drivers with constant nagging.
Tesla considered a few different types of monitoring: one that would track a driver’s eyes using a camera and infrared sensors, and another that would add more sensors to the steering wheel to confirm that the driver is holding on. Either approach would let the car’s software know when the driver has stopped paying attention, which could reduce the chance of an accident in situations where Autopilot disengages or is incapable of keeping the car from crashing.
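For a sense of what the eye tracking option involves, here is a minimal sketch of the idea: deciding whether a driver’s gaze has been on the road for most of a recent time window. Everything here is hypothetical for illustration, including the `GazeSample` fields, the gaze-angle cone, and the window size; none of it reflects Tesla’s actual design.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """One reading from a hypothetical camera + infrared eye tracker."""
    timestamp: float   # seconds
    yaw_deg: float     # horizontal gaze angle, 0 = straight ahead
    pitch_deg: float   # vertical gaze angle, 0 = horizon
    valid: bool        # False when the tracker loses sight of the eyes

def eyes_on_road(samples: list[GazeSample],
                 window_s: float = 2.0,
                 max_off_road_ratio: float = 0.5) -> bool:
    """Return True if the driver looked at the road for most of the window.

    A sample counts as "on road" when the gaze falls inside a cone around
    straight ahead; the cone size and window are illustrative values only.
    """
    if not samples:
        return False
    latest = samples[-1].timestamp
    recent = [s for s in samples if latest - s.timestamp <= window_s]
    off_road = sum(
        1 for s in recent
        if not s.valid or abs(s.yaw_deg) > 25 or abs(s.pitch_deg) > 15
    )
    return off_road / len(recent) <= max_off_road_ratio
```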
Musk later confirmed on Twitter that the eye tracking option was “rejected for being ineffective, not for cost.”
This is false. Eyetracking rejected for being ineffective, not for cost. WSJ fails to mention that Tesla is safest car on road, which would make article ridiculous. Approx 4X better than avg.
— Elon Musk (@elonmusk) May 14, 2018
While a name like “Autopilot” might suggest the system can handle any situation on its own, accidents still happen even when Autopilot is engaged, and three people have died while using the feature. Tesla promises that Autopilot will one day be capable of fully driving the car itself, but the system currently more closely resembles the limited driver assistance packages offered by GM, Nissan, and others.
Tesla cars do lightly monitor drivers, using a sensor to measure small movements in the steering wheel. If the driver doesn’t have their hands on the wheel, the car issues repeated warnings, and eventually it pulls itself to the side of the road and has to be reset before Autopilot can be turned on again. That capability was only added months after Autopilot’s 2015 release, though, after a rash of drivers posted videos of themselves using the feature in reckless ways. Even now, there is evidence that it’s possible to fool the steering wheel sensor.
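That warning-and-pullover escalation amounts to a small state machine. The sketch below models it under stated assumptions: the state names, the `torque_detected` signal, and the timing thresholds are illustrative guesses, not Tesla’s actual parameters or code.

```python
from enum import Enum, auto

class AlertState(Enum):
    MONITORING = auto()    # hands detected recently, no warnings
    WARNING = auto()       # escalating visual/audible warnings
    PULLING_OVER = auto()  # car is steering itself to the roadside
    LOCKED_OUT = auto()    # Autopilot unavailable until reset

class HandsOnMonitor:
    """Toy model of a hands-on-wheel escalation; all values illustrative."""

    WARN_AFTER_S = 30.0       # hands-off seconds before the first warning
    PULL_OVER_AFTER_S = 60.0  # hands-off seconds before pulling over

    def __init__(self) -> None:
        self.state = AlertState.MONITORING
        self.hands_off_time = 0.0

    def update(self, torque_detected: bool, dt: float) -> AlertState:
        """Feed one sensor sample; dt is seconds since the last sample."""
        if self.state == AlertState.LOCKED_OUT:
            return self.state  # stays locked out until Autopilot is reset
        if torque_detected:
            # Small steering movements count as hands-on and reset the clock,
            # which is also why a torque-based sensor can reportedly be fooled.
            self.hands_off_time = 0.0
            self.state = AlertState.MONITORING
            return self.state
        self.hands_off_time += dt
        if self.hands_off_time >= self.PULL_OVER_AFTER_S:
            self.state = AlertState.PULLING_OVER
        elif self.hands_off_time >= self.WARN_AFTER_S:
            self.state = AlertState.WARNING
        return self.state

    def vehicle_stopped(self) -> None:
        """Called when the pull-over maneuver finishes; locks out Autopilot."""
        if self.state == AlertState.PULLING_OVER:
            self.state = AlertState.LOCKED_OUT
```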
In contrast, GM’s semi-autonomous system, Super Cruise, watches a driver’s face to make sure they’re paying attention to the road. It also allows hands-free driving.
Broadly, though, the National Transportation Safety Board said last September that the whole industry needs to do a better job of building in safeguards that ensure these driver assistance features aren’t misused.
The NTSB’s statements came at the conclusion of the safety board’s investigation into the June 2016 death of Joshua Brown, the first person to die while using Autopilot in the United States. (A driver who was killed while using Autopilot in China in January 2016 is now believed to be the world’s first death involving a driver assistance feature.) At the time, the safety board specifically recommended that Tesla find ways beyond steering wheel sensors to monitor drivers. The NTSB is currently investigating the most recent Autopilot death, which happened in March in California.
Tesla often points out that the number of accidents involving Autopilot is small compared to the scale and frequency of more typical auto accidents. And Musk recently pledged that Tesla will regularly release data about Autopilot’s performance, starting at the end of this financial quarter. But Musk also recently said that Autopilot accidents tend to happen because drivers’ attention can drift, something that better driver monitoring might address.
“When there is a serious accident it is almost always, in fact maybe always, the case that it is an experienced user, and the issue is more one of complacency,” Musk said on a quarterly earnings call earlier this month. “They just get too used to it. That tends to be more of an issue. It’s not a lack of understanding of what Autopilot can do. It’s [drivers] thinking they know more about Autopilot than they do.”