Dawn Project videos show Tesla’s driver monitoring system fails to detect a sleeping driver, a teddy bear at the wheel, or an empty driver’s seat

SANTA BARBARA, Calif., Aug. 3, 2023 /PRNewswire/ — The Dawn Project has today released safety test videos showing that a self-driving Tesla’s driver monitoring system fails to detect when a driver texts, reads, watches movies or even falls asleep at the wheel. The car also does not recognise when a teddy bear, unicorn, or nothing at all is in the driver’s seat.

Video: Tesla’s driver monitoring system fails to detect stuffed animals and balloons at the wheel and allows the car to drive with no one in the driver’s seat

Video: Tesla’s driver monitoring system fails to detect a person watching a movie on a laptop, texting, reading a book or asleep at the wheel

Tesla warns that its self-driving software “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road”. It also warns that the software can cause the car to “suddenly swerve even when driving conditions appear normal and straight-forward”.

Regulators have allowed this defective software to be sold to millions of ordinary consumers only on the condition that a driver is in the car, paying attention to the road, with both hands on the steering wheel, and ready to take over immediately.

Research has shown that the only way to ensure a driver is paying attention is to implement an effective driver monitoring system using cameras.

Tesla duped NHTSA and the California DMV into designating Full Self-Driving as a Level 2 Advanced Driver Assistance System, while simultaneously marketing it as a fully autonomous car.

Joshua Brown’s fatal self-driving collision with a tractor-trailer in June 2016 was attributed to driver inattention by the National Transportation Safety Board (NTSB). The NTSB said the truck should have been visible to Brown for at least seven seconds before impact. Brown “took no braking, steering or other actions to avoid the collision”, the NTSB report said. As a result, NHTSA required Tesla to add a driver monitoring system.

The latest tests were conducted on a real road in Santa Barbara, with a person in the passenger seat ready to take over. The tests were designed to establish whether Tesla’s driver monitoring system would detect and issue a “pay attention to the road” warning in each of the scenarios shown in the videos above: a driver texting, reading a book, watching a movie on a laptop or asleep at the wheel, and stuffed animals, balloons or nothing at all in the driver’s seat.

The Dawn Project invited John Bernal, a former Tesla Autopilot employee who covers electric vehicle news on his YouTube channel, AIAddict, to conduct and observe the tests involving non-human objects in the driver’s seat. The tests were run in two separate Tesla Model 3s, neither of which had been modified in any way.

These tests follow a video recently released by The Dawn Project showing a self-driving Tesla driver looking out of the window for five minutes while eating a meal, as well as rummaging in the back seats for five minutes, all without receiving any warnings from the driver monitoring system.

Dan O’Dowd, Founder of The Dawn Project, commented: “Tesla’s driver monitoring system is ineffective and unfit for purpose. NHTSA forced Tesla to introduce a driver monitoring system to ensure the driver is paying attention to the road. However, Tesla duped the regulator by implementing an ineffective driver monitoring system.

“Did Tesla knowingly ship a defective driver monitoring system that fails to detect driver inattention?

“This ineffective driver monitoring system is in over 4 million Tesla vehicles made in the last five years. We tested it on two cars and achieved exactly the same results. Pedestrians, cyclists and drivers have no way of knowing whether the person ‘supervising’ an ineffective self-driving Tesla is actually paying attention to the road, or is asleep at the wheel.

“In order to avoid the stringent regulatory approval process applied to Level 4 autonomous vehicles, Tesla duped NHTSA and the DMV into regulating Autopilot and Full Self-Driving as Level 2 Advanced Driver Assistance Systems.

“Our videos show that Teslas will drive autonomously with no one sitting in the driver’s seat and the steering wheel moving back and forth, just like in a Waymo or Cruise Level 4 autonomous robotaxi. Teslas are Level 4 autonomous vehicles, just as the names ‘Autopilot’ and ‘Full Self-Driving’ promise and as has always been Tesla’s intention. Elon Musk recently boasted that people ‘only fully understand when they’re in [the] driver’s seat, but aren’t driving at all.’

“Tesla has reported 840 accidents and 23 fatalities to NHTSA. Tesla has made over $4 billion of additional revenue from its self-driving software.

“A number of fatalities occurred when a self-driving Tesla failed to recognise a large tractor trailer crossing the highway in front of the vehicle; the trailer sheared off the Tesla’s roof, killing the driver.

“This raises the question: why didn’t the driver brake? A tractor trailer is usually a very easy thing for a person to see. However, in many of these accidents, it appears that the driver took no action whatsoever to avoid the fatal collision. The only reason a driver would not brake before hitting such a large object is that they were not paying attention.

“If the driver wasn’t paying attention, why weren’t they warned by Tesla’s driver monitoring system, and why didn’t the system disengage self-driving mode as it is supposed to?”

Notes to editors

Dan O’Dowd is an entrepreneur and CEO with over 40 years’ experience in designing and writing secure, safety-critical software. Dan has built operating systems for the U.S. military’s fighter jets and for some of the world’s most trusted organizations, such as NASA, Boeing, and Airbus.

In 2021, Dan O’Dowd founded The Dawn Project, which campaigns to make computers safe for humanity by ensuring that all software in safety-critical infrastructure never fails and can’t be hacked. The first danger The Dawn Project is tackling is Tesla’s deployment of unsafe Full Self-Driving cars on our roads.


Photo – https://mma.prnewswire.com/media/2169216/Test_A.jpg

Photo – https://mma.prnewswire.com/media/2169217/Test_B.jpg

SOURCE The Dawn Project

