CleanTechnica: Tesla FSD Drives Down Train Track, Waymo Gets Stuck In Intersection

Robotaxis may be the future, and they are already here in some places, but there are still issues to resolve. I’ve seen two quite notable problems pop up in recent days with the two most prominent self-driving brands in the US.
Tesla Drives Down A Train Track
First of all, we have the story from Sinking Spring, Pennsylvania. Three people were riding in a Tesla that was in Full Self Driving (FSD) mode. The car, based on its “vision only” self-driving system, determined that some railroad tracks were a roadway, turned left to get on the tracks, and then proceeded to drive along them. Minutes later, the Tesla was struck by a train.
Luckily, the people in the Tesla had the sense to realize something was wrong and get out of the car before it was struck by the train. (Why didn’t they just turn off FSD and drive back to the road themselves? I don’t know. Maybe they didn’t know the best way to drive out, or maybe they heard a train coming and got out of the car as quickly as possible.)
“[The Tesla] went down the tracks approximately 40-50 feet,” Western Berks Fire Commissioner Jared Renshaw reported to local news outlet WFMZ.
Naturally, this was not one of Tesla’s test robotaxis. Those are only in Austin, Texas, at the moment, and there are only something like 15 of them. The question is whether FSD is really ready for this stage.
“The National Highway Traffic Safety Administration examined nearly 500 crashes involving Tesla vehicles operating in self-driving mode, 13 of which resulted in fatalities,” Yahoo! News writes. The self-driving Tesla “struck another vehicle or obstacle with adequate time for an attentive driver to respond or mitigate the crash” in 45% of those accidents, according to the NHTSA. In another 31% of cases, “in low traction conditions such as wet roadways,” the Tesla slid off the road. Unsurprisingly, the NHTSA concluded that Tesla drivers in these cases had too much trust in the FSD system. In other words, since it was still the humans’ responsibility to monitor the car’s driving, it was their fault for not taking over sooner to avoid a crash. But that over-trust appears to have come from thinking FSD was better than it actually is. Circling back to the train story, one has to wonder why the driver didn’t take over as soon as they noticed the car starting to turn onto the train tracks. Hmm … excessive faith in FSD, I guess?
Waymo Stops In An Intersection
On the other side of the US, a friend recently shared with me a story of a Waymo robotaxi stopping in the middle of an intersection in San Francisco and just sitting there.

Back to San Francisco, CA for yet another critical safety failure in the middle of an intersection for driverless Robotaxi company Waymo.
As we continue to sort through these videos, the one thing we find over and over again is that Waymo’s are not good with intersections.… pic.twitter.com/d52O0owmDC
— No Safe Words (@Cyber_Trailer) June 29, 2025

Hmm … that’s not good. The passengers are certainly having a good time cracking up about it, but as much fun as they’re having, it’s clearly not a safe scenario.
It’s not clear why the Waymo was stuck in the middle of the intersection. Right after the video starts, you can see a couple of other cars driving forward in the lane to the left, but the Waymo just sits there. The light is yellow, then turns red, and the Waymo stays in place as cross-traffic starts moving. The Waymo is actually blocking the path of a trolley/tram, and cross-traffic from the left is blocked by the car. Then a remote Waymo operator takes over the car and drives it out of the intersection, even though the light is red. I’m not sure that’s the preferred solution in that situation either … but that’s how the issue was eventually resolved.
Why is a Waymo in San Francisco still running into an edge case this bad? Who knows? It’s not at all clear why the robotaxi stopped in the middle of the intersection before the guys in the car started filming.
Are we ready for robotaxis? It depends on who you ask. Are they safer than human drivers, as stats seem to show? Probably, but not in all scenarios, and certainly not all self-driving systems.
See any more “funny” videos or news about robotaxis doing weird things? Share them down below.
