On February 15, 2023, NHTSA issued a recall notice for nearly 400,000 Tesla vehicles equipped with the so-called “Full Self Driving” (Beta) driver-assist system. In it, the agency said, “The FSD Beta system may allow the vehicle to act unsafe around intersections, such as traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete stop, or proceeding into an intersection during a steady yellow traffic signal without due caution. In addition, the system may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of the vehicle’s speed to exceed posted speed limits.”
Tesla says it has an over-the-air update ready for its Full Self Driving Beta software, designated version 11.3.2 (2022.45.11). The update includes an enormous number of changes that address all of the concerns NHTSA identified in its recall notice, along with several others. The full release notes for the update can be found at the Not A Tesla App website.
Tesla Full Self Driving Beta Update Details
According to Yahoo!, the update merges the FSD Beta software stack with the Autopilot highway software stack, which should improve Autopilot, a system that has reportedly not had a significant update in the past four years. In addition, the version 11 release addresses issues like how the system responds when other drivers pull in front of a Tesla while FSD is active, how the software positions the car in wide lanes, how the system responds to a lane blockage in the road ahead, how the car changes lanes and makes turns, and how the car communicates with the driver.
The release also addresses the larger safety issue. In the release notes, Tesla says, “In accordance with a recent recall (campaign #23V085 for U.S. and #2023-063 for Canada), Tesla is making improvements to the following specific behaviors within FSD Beta” and then lists a number of items addressed:
Improved decision logic to proceed through or stop at certain yellow lights by modeling the decision as a tradeoff framework that considers estimated: deceleration required to stop, time to enter and exit the intersection, and the distance traversed across the intersection before the light transitions to red. This should make yellow light handling more natural and human-like.
Improved the longitudinal slowdown control profile when leading up to stop sign intersections to make the overall maneuver feel more perceptible and natural.
Improved the Tesla’s speed adjustment when entering certain speed zones by allowing for earlier control for detected speed limit signs. The assertiveness of the response when slowing down for detected speed limit signs is determined by the current speed and its difference to the speed indicated by the detected sign. Added a visual glow behind the speed limit icon on the user-interface to alert the driver when the vehicle’s set speed exceeds the detected speed limit by more than 50%. Finally, the option for an absolute Speed Limit offset in FSD Beta was removed; only the percent-based offset will be available.
Updated the behavior for certain scenarios where the Tesla may maneuver from a turn lane to continue traveling straight. These maneuvers will now be treated as a lane change, where the turn indicator is used to alert other drivers of the Tesla’s intent.
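To make the yellow-light item above a bit more concrete, here is a minimal Python sketch of how such a go/stop tradeoff could be framed. It is purely illustrative, not Tesla's actual implementation; every name and constant here (required_deceleration, COMFORTABLE_DECEL_LIMIT, and so on) is hypothetical, and the thresholds are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical constants for the illustration; real values would come from
# vehicle dynamics and traffic-engineering data, not from this sketch.
COMFORTABLE_DECEL_LIMIT = 3.5  # m/s^2, roughly a firm but comfortable stop

@dataclass
class IntersectionState:
    distance_to_stop_line: float  # meters
    intersection_length: float    # meters
    speed: float                  # m/s
    yellow_time_remaining: float  # seconds (estimated)

def required_deceleration(state: IntersectionState) -> float:
    """Constant deceleration needed to stop at the stop line: v^2 / (2d)."""
    return state.speed ** 2 / (2.0 * max(state.distance_to_stop_line, 0.1))

def time_to_clear(state: IntersectionState) -> float:
    """Time to enter and fully exit the intersection at the current speed."""
    travel = state.distance_to_stop_line + state.intersection_length
    return travel / max(state.speed, 0.1)

def should_proceed(state: IntersectionState) -> bool:
    """Toy tradeoff: proceed if the car can clear before the light turns red,
    stop if a comfortable stop is available, otherwise keep going rather than
    brake harshly inside the intersection."""
    can_clear_before_red = time_to_clear(state) <= state.yellow_time_remaining
    can_stop_comfortably = required_deceleration(state) <= COMFORTABLE_DECEL_LIMIT

    if can_clear_before_red:
        return True
    return not can_stop_comfortably

# Example: 20 m from the line, 15 m wide intersection, 15 m/s (~34 mph),
# 2 s of yellow left -> stopping needs ~5.6 m/s^2, clearing takes ~2.3 s,
# so the toy logic proceeds rather than brake hard in the "dilemma zone."
print(should_proceed(IntersectionState(20.0, 15.0, 15.0, 2.0)))
```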
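In the same spirit, the speed-limit item boils down to two simple rules: the set-speed offset is now expressed only as a percentage of the detected limit, and the new warning glow is a threshold check. Here is a minimal sketch under those assumptions; the function names and numbers are ours, not Tesla's.

```python
def set_speed_from_limit(detected_limit_mph: float, percent_offset: float) -> float:
    """Percent-based offset only: a 10% offset over a 60 mph limit gives 66 mph.
    The absolute (fixed mph) offset option is no longer available in FSD Beta."""
    return detected_limit_mph * (1.0 + percent_offset / 100.0)

def speed_limit_glow(set_speed_mph: float, detected_limit_mph: float) -> bool:
    """Show the glow behind the speed limit icon when the set speed exceeds
    the detected limit by more than 50%."""
    return set_speed_mph > detected_limit_mph * 1.5

# Example: a 10% offset over a 30 mph limit sets 33 mph (no glow), while a
# driver-set 50 mph in the same zone is 66% over the limit and triggers it.
print(set_speed_from_limit(30.0, 10.0))  # 33.0
print(speed_limit_glow(33.0, 30.0))      # False
print(speed_limit_glow(50.0, 30.0))      # True
```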
Then, of course, Tesla issued its standard reminder that the driver is always responsible for vehicle operation even when FSD Beta is engaged. “You must constantly supervise the road, keep your hands on the wheel and be ready to intervene to maintain safety,” the company says.
Drivers will notice that their touchscreen looks different after the update is installed, as Tesla has reorganized its appearance and how it functions. The changes are extensive and include new ways for the car to communicate with the driver that are expected to be more intuitive and easier to understand.
Tesla And The Mind Of Musk
FSD edge case? Credit: Fritz Hasler
While the new update is filled with promise, we will have to wait to get feedback from owners before we know how successful it is at addressing complaints drivers have had with the system. Our own Fritz Hasler has had issues with the system, which he has dutifully reported on regularly. Two other members of the CleanTechnica team own Teslas with FSD and have been less than thrilled with its performance, to the point where they have both stopped using it.
Tesla, of course, does not deign to respond to any press inquiries, as it has too much important stuff to do to waste time with silly journalists and their niggling questions. Undaunted, the Washington Post did an in-depth story about Full Self Driving last Sunday. What was most striking about that article is the number of times people who spoke with the Post asked not to be identified for fear of reprisals. Tesla is run like a third-world dictatorship and punishes those who do not toe the company line. Every organization is a reflection of its leaders and, in this case, that means Elon Musk.
In May of 2021, Tesla announced that it was eliminating the use of radar in its cars. Shortly thereafter, it began deactivating the radar units installed in cars already manufactured. [Over-the-air updates are a two-way street. What can be enabled wirelessly can also be disabled wirelessly.]
Some Tesla engineers were aghast, sources told the Washington Post. They contacted a trusted former executive for advice on how to talk Musk out of it. They feared that without radar, the cars would be susceptible to basic perception errors if the cameras were obscured by raindrops or even bright sunlight — problems that could lead to crashes. [Editor’s note: Those two conditions sometimes cause my Tesla to say it cannot engage Autopilot/FSD. I am not at all surprised to read about their concerns. —Zach] Musk was unconvinced and overruled his engineers. Once radar was eliminated, according to interviews with nearly a dozen former employees, test drivers, safety officials, and other experts, there was an increase in crashes and near misses.
They said Musk’s erratic leadership style forced them to work at a breakneck pace to develop the technology and to push it out to the public before it was ready. Some said they are worried that, even today, the software is not safe to be used on public roads. Most spoke on the condition of anonymity for fear of retribution.
“The system was only progressing very slowly internally” but “the public wanted a product in their hands,” said John Bernal, a former Tesla test operator who worked in the Autopilot department. He was fired in February of 2022 after he posted videos online of FSD in action (or not in action, as the case may be). “Elon keeps tweeting, ‘Oh we’re almost there, we’re almost there,’” Bernal said. But, “internally, we’re nowhere close, so now we have to work harder and harder and harder.” The team has also bled members in recent months, including senior executives, he said.
Devil Or Angel
The debate over whether Musk is a devil or an angel is one that always draws strong responses from people. Some see Musk as a god-like figure, citing his incredible wealth as some sort of litmus test of his genius. Others see him as an autocratic, domineering, inflexible tyrant who rules with an iron fist and brooks no dissent.
Those of us who have been following Tesla for more than a decade remember when Mobileye was behind the newly introduced Autopilot system. Then Joshua Brown died on a Florida highway when his Model S failed to “see” a tractor trailer crossing the road. After that, Musk had a public hissy fit, and a very messy divorce between Tesla and Mobileye ensued. Shortly thereafter, Tesla made radar the primary sensor and relegated cameras to a lesser status in the Autopilot hierarchy.
Musk was effusive about the capabilities of radar and waxed eloquent about how Tesla could now bounce radar signals under a vehicle in front to detect what was going on ahead of it. There was nothing radar couldn’t do then … but now it has been eliminated entirely, meaning there is no backup system to help the cameras make sense of the external world.
Some may recall the Ashlee Vance biography of Musk, which describes how, when he was the head honcho at PayPal, he excoriated workers who didn’t want to stay until 3:00 am. That is a common refrain from Musk. When he took over Twitter, he made a big noise about only wanting engineers who are “hardcore” and willing to work 18-hour shifts or longer. If you want to work for Elon, you need to be willing to put the rest of your life on hold and focus on his needs to the exclusion of all else.
Kicking this around at the breakfast bar over avocado toast this morning, some members of the crack CleanTechnica editorial staff wondered aloud whether we should feel reassured that engineers working for days and weeks on end with little sleep are the people writing code that affects not only Tesla drivers but also the non-Tesla drivers who share the road with them, people who do not know they are part of a grand self-driving experiment known as Tesla Full Self Driving version 11.3.2. We know our readers have strong opinions on this topic and can’t wait to read your comments.