The US Department of Transportation has expanded its investigation into Tesla's "Autopilot" driver assistance system after a series of rear-end collisions. Since the investigation began in August, the agency has identified six further incidents in which Teslas with Autopilot engaged crashed into emergency vehicles parked at the side of the road; the original investigation covered eleven such accidents. The most recent crash happened in January.
The investigation is now being expanded to include, among other things, the evaluation of additional data, as the traffic safety authority NHTSA announced in a document. The agency is also looking at well over 100 Autopilot accidents that did not involve emergency vehicles, and it will examine to what extent the electric car manufacturer's system increases the risk of human error. The NHTSA sees signs that in around 50 of the accidents investigated, the drivers reacted inadequately to the traffic situation.
Tesla itself points out to customers that the system with the misleading name "Autopilot" is only an assistance system. The person in the driver's seat must therefore keep their hands on the steering wheel at all times and be ready to take control at any moment. Nevertheless, it happens again and again that drivers rely completely on the system. Meanwhile, a court-commissioned expert opinion has also warned of dangers posed by Tesla's Autopilot.
Tesla tightened its safety measures a few years ago: the software detects when the driver's hands are not on the wheel and emits warning tones after a short time. According to the NHTSA, the current Autopilot investigation covers an estimated 830,000 vehicles across all four current model series from the years 2014 to 2022.
Fatal accident in 2016
The NHTSA had already investigated the Autopilot system after a fatal accident in 2016. At the time, a driver died when his Tesla crashed under the trailer of a semi-truck crossing the road. The NHTSA concluded that the system had worked correctly within its capabilities, but that the human driver had relied on it too heavily. Autopilot had not recognized the trailer with its white side panel and had not braked; the driver did not respond either.
In the current investigation, the NHTSA pointed out that in all of the rear-end collisions, the fire and ambulance vehicles were clearly identifiable, not least because their emergency lights were flashing. In September of last year, Tesla released a software update intended to help Autopilot recognize emergency vehicles with their distinctive flashing lights even in difficult lighting conditions. The NHTSA subsequently questioned why the update was not declared a recall.
Tesla boss Elon Musk (50) has always emphasized that Autopilot makes driving safer and helps to avoid accidents. For several months, the company has been letting selected beta testers try out the next version of the software, which adds functions for city traffic. Many videos are circulating on the web in which the software makes mistakes, and the NHTSA has already requested information about the tests on public roads.
Since February, the NHTSA has also been investigating Tesla over reports of sudden braking. The trigger was 354 complaints within nine months about the Autopilot system suddenly and unexpectedly applying the brakes. The agency has also requested information from other car manufacturers about their assistance systems.