Tesla vehicle nearly involved in an accident in autonomous driving mode, with an analyst on board.
A test of Tesla’s self-driving system revealed concerning vulnerabilities, an analyst reports. According to Bloomberg, an analyst testing Tesla’s “Full Self-Driving” (FSD) mode experienced a dangerous situation.
William Stein, the tester, did praise the vehicle’s ability to respond to road closures, potholes, and traffic flow. However, he expressed concern about the laxity of the system: he was surprised that the steering wheel no longer had to be touched to keep FSD active. He was even able to continue using the system without looking at the road.
Stein had to intervene several times to prevent accidents. In one case, the Tesla was on course to crash into the vehicle ahead. In another situation, the system ignored a police officer’s instruction to slow down for a passing funeral procession. The vehicle also made an illegal lane change on a winding section of road. He concluded that the FSD system was far from achieving true autonomy.
Tesla, like other manufacturers of autonomous vehicle technology, has long faced criticism from regulatory authorities, particularly over the marketing of its self-driving systems. It should be noted, however, that the system tested was an older version.
These findings underline the need for further development in the area of autonomous driving. They also illustrate how important it is for drivers to remain attentive and ready to intervene at all times, even with advanced assistance systems.