The 45-minute video was meant to demonstrate v12 of Tesla’s Full Self-Driving but ended up being a list of things not to do while using FSD.
Illustration by Laura Normand / The Verge
Elon Musk’s Tesla almost ran a red light during a livestreamed demo of the company’s Full Self-Driving (FSD) beta software. Musk was also filming the livestream from the driver’s seat, violating Tesla’s rules for its advanced driver-assist technology. Oh, and he kind of doxxed Mark Zuckerberg, too.
The roughly 45-minute video was meant to demonstrate the prowess of v12 of Tesla’s advanced driver-assist technology, which has yet to be released to customers. And while the vehicle appears to be operating safely for the majority of the trip, it still ends up being a bizarre experience — which is typical of all things Musk.
At around the 19-minute mark, Musk is forced to take the steering wheel as the vehicle tries to accelerate through a red light in Palo Alto. The car appears to have misread the traffic signal and attempted to proceed through the intersection at the wrong time. Musk posted the grainy video on X, formerly known as Twitter, last Friday.
“So that’s our first intervention because the car should be going straight,” Musk said after taking control of the vehicle. “That’s why we’ve not released this to the public yet.” (FSD is technically beta software, though Musk has said that v12 will be the first time Tesla removes that label.)
To be sure, the video is noteworthy for other reasons, too. With Musk in the driver’s seat filming on his smartphone, the vehicle is seen driving through several roundabouts and construction zones with relative ease. Musk explains that v12 will be the first version of FSD that is “entirely AI and cameras.” Tesla’s approach to self-driving technology deviates from that of most other companies, which rely on a variety of sensors, including lidar and radar; Tesla uses only cameras.
But the moment when Musk was forced to intervene at the traffic light has already been seized upon by critics who say Tesla’s approach to autonomous driving is insufficient and reckless.
Musk is also in violation of Tesla’s own rules for how drivers should behave while using FSD. By filming from the driver’s seat and responding to commenters during the drive, he is ignoring his own company’s guidelines, which advise drivers to keep their hands on the steering yoke at all times. According to Tesla’s handbook:
Full Self-Driving (Beta) is a hands-on feature. Keep your hands on the steering yoke (or steering wheel) at all times, be mindful of road conditions and surrounding traffic, and always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.
Musk has said that FSD is being tested as beta software to emphasize the need for drivers to pay attention to the road while using the driver-assist feature. But some have noted that the beta label could allow Tesla to avoid legal liability in the case of a crash.
The video is also of poor quality and often grainy. It regularly flips between vertical and horizontal framing, and Musk frequently comments that he hopes someone can edit the footage to make it more interesting.
At around the 27-minute mark, Musk claims he is going to drive to Meta CEO Mark Zuckerberg’s house, something he has previously threatened to do as part of their much-publicized (but likely never-to-happen) fight.
Musk Googles Zuckerberg’s address and then displays it prominently on-screen. (Remember, Musk banned @ElonJet, the account that tracks his private jet, from X/Twitter, claiming it was a “direct personal safety risk” to him.)
“This cannot be considered doxxing if you just google it,” Musk says.
The broader context here is that the federal government’s two-year investigation into Tesla’s highway driver-assist feature, Autopilot, is nearing its end, which may have prompted Musk to post the video as a provocation.
The day before Musk livestreamed his drive, Reuters reported that the National Highway Traffic Safety Administration was planning to resolve its investigation into more than a dozen incidents in which Tesla vehicles using Autopilot crashed into stationary emergency vehicles. The government could force a recall of Autopilot and, by extension, FSD, which could affect Tesla’s valuation, much of which hinges on the company’s promise that it will offer full autonomy to its customers in the near future.