Cybertruck Gets FSD, Tries to Drive Onto Median in the Middle of Sunset Boulevard

Tesla has started rolling out its controversial and erroneously named “Full Self-Driving” software to its Cybertruck — and what could possibly go wrong?

A small number of testers have now received an over-the-air software update enabling a “Supervised” version of the driver assistance package. Once enabled, the system allows their 6,600-pound pickup trucks to handle most of the driving, including on city streets and through complex intersections.

But as the Tesla fan behind the X-formerly-Twitter account Whole Mars Catalog found out first-hand, the software — which remains in an unfinished “Early Access” state but can nonetheless be experimented with on public streets — is far from perfect.

A video shared by the account on YouTube today shows the truck turning left onto Sunset Boulevard in Los Angeles. But the driver is quickly forced to intervene to stop the Cybertruck from rolling right into the median strip.

“Not so beautiful after all,” the driver said, shortly after swerving into the correct lane, correcting himself seconds after praising the experience. “So it was gonna drive onto the median.”

The close call highlights the sheer dangers of testing out the flawed software in public. We’ve already seen our fair share of run-ins involving FSD — and the Cybertruck won’t be any different, even if it is heavier and has strikingly sharp edges.

Meanwhile, Tesla CEO Elon Musk has essentially bet the entire fate of the EV maker on the software and the development of a so-called “robotaxi” — so the company’s unconvincing efforts to realize a fully self-driving car could be a worrying sign of even more trouble in the future.

Cybertruck Full Self-Driving (Supervised) Early Access Release 1

Despite many years of development and plenty of overly optimistic predictions by Musk, FSD still requires the driver to be ready to intervene at any time, making the experience choppy at best.

The driver assistance software has already proven quite dangerous. Regulators have previously found that Autopilot and FSD, which have been linked to hundreds of crashes, can lull drivers into a false sense of security.

Just last week, a team of researchers at the independent firm AMCI Testing drove a Tesla in FSD mode for over 1,000 miles, but had to intervene over 75 times, or roughly every 13 miles.

“What’s most disconcerting and unpredictable is that you may watch FSD successfully negotiate a specific scenario many times — often on the same stretch of road or intersection — only to have it inexplicably fail the next time,” AMCI director Guy Mangiamele said in a statement.

Other than having to avoid a potential disaster in the middle of Sunset Boulevard, the rest of Whole Mars Catalog’s test drive appears to have been far less eventful.

“There are a few more interventions that you wouldn’t see on the Model 3,” the driver in the video said at one point. “The Model 3 probably would’ve nailed that.”

And a separate left turn onto a one-way street was “handled beautifully.”

“I have to say overall I’m impressed, I expected it to be a lot worse,” the driver said. “I wouldn’t hesitate to take this over the Model 3 on a long drive.”

“Overall, not perfect, almost drove onto the median on Sunset Boulevard,” he concluded.

It’s still unclear when the rest of the Cybertrucks currently on the road will get access to the $8,000 add-on. The company is gearing up for the release of Version 13, which, according to Tesla, will mark a substantial improvement as far as “necessary interventions” are concerned.

But given the company and its CEO’s track record, there’s a good chance Cybertruck owners will have to remain patient. Whether that’s a good or a bad thing for anybody happening to share the road with them remains to be seen.

More on Tesla: Cybertruck Owner Says Windshield Shattered When Wiped With a Microfiber Cloth

