Tesla’s “Autopilot” driving assistance system has become the target of a new investigation by the US traffic safety regulator NHTSA. The focus is on the advanced version, “Full Self-Driving” (FSD). Tesla intends for its electric cars to eventually drive autonomously, but for now the people behind the wheel must still intervene in the event of errors. The new investigation covers, among other things, cases in which the software steered cars into the oncoming lane or let them drive through intersections despite red lights.
The NHTSA is investigating 58 incidents in which 23 people were injured; there were no deaths. Just a few weeks ago, Tesla averted a trial over a fatal accident that occurred with “Autopilot” activated: a 2019 crash in which a 15-year-old was killed.
Drivers must intervene
Tesla has been letting drivers in the US test the FSD software on public roads for some time now. Especially early on, they published many videos in which the system sometimes made serious errors and the people behind the wheel had to intervene to prevent accidents. According to Tesla CEO Elon Musk (54), the software has improved considerably with newer versions.
There is, however, good news for the company from China: Tesla’s sales there rose by 2.8 percent in September compared with the same month last year.