CleanTechnica: California Bans Deceptive Self-Driving Claims

False self-driving claims have been outlawed in California by Senate Bill 1398, which was recently signed into law by Governor Gavin Newsom. The bill was introduced by state Senator Lena Gonzalez, a Democrat from Long Beach, who writes in her summary of the new legislation, “Senate Bill 1398 increases consumer safety by requiring dealers and manufacturers that sell new passenger vehicles equipped with a semi-autonomous driving assistance feature or provides any software update or vehicle upgrade that adds a semi-autonomous driver assistance feature to give a clear description of the functions and limitations of those features. Further, SB 1398 prohibits a manufacturer or dealer from deceptively naming, referring to, or marketing these features.”
Well, golly. Who do you suppose is the target of the new law? If you said “Tesla,” go to the head of the class! According to GovTech, a feature of the San Francisco Chronicle, the new law prohibits California dealers and manufacturers from “deceptively naming or marketing” a car as self-driving if it’s only equipped with partial automation features that still require human drivers to pay attention and handle driving chores themselves.
The California Department of Motor Vehicles, which regulates autonomous vehicles, already had rules banning the false advertisement of self-driving cars. However, Gonzalez told the Los Angeles Times in August that the DMV’s lack of enforcement prompted her and other state legislators to advance the bill in order to make the ban part of state law. According to a legislative analysis of the new law, Waymo, one of the companies the state permits to test and operate autonomous vehicles, stopped describing its cars as self-driving in 2021, citing confusion among drivers caused by Tesla’s advertising.
The legislative summary continues, “Many manufacturers offer level 2 features and promote them as a selling point. Hence, it is easy for the average consumer to believe that based on unclear naming, advertising, or marketing, they are purchasing a vehicle with fully autonomous features when the vehicle can only perform functions similar to autopilot or cruise control.
“When a consumer purchases a vehicle, vehicle upgrade, or software update they believe is changing the automation level of their vehicle, the consumer may pay less attention to monitoring the vehicle while operating or may use the feature in an unsafe and unintended way. This can have dangerous consequences, including increased accidents on California roads or death.”
According to Autoblog, Tesla’s Autopilot function is closer in operation to advanced adaptive cruise control than to a system that could control the vehicle indefinitely without driver input. While the new legislation doesn’t name or target Tesla directly, Tesla is the leading EV producer in the United States, and its advanced driver assistance technologies have caused more than their fair share of controversy and been involved in some notable crashes.
Traffic on the Oakland Bay Bridge was halted on Thanksgiving Day after a Tesla said to be operating in Full Self-Driving mode slowed abruptly, causing a number of chain-reaction collisions that injured 18 people. The driver told authorities that his 2021 Tesla Model S was using the company’s “Full Self-Driving” software, according to a Highway Patrol report obtained by CNN and reported by KRON. In his statement, he said he was driving at 55 mph when the car suddenly swerved into the left lane and slowed to around 20 mph, which in turn caused several cars behind to slam into the back of the Tesla.
NHTSA has opened investigations into Tesla’s driver assistance features after multiple crashes, several of them fatal. What is especially troubling is that while Tesla usually claims Autopilot or FSD was not active at the time of those crashes, there are suggestions in the press that those systems can disengage as little as one second before a collision, giving drivers no time to reassert control over their vehicles. That has not been substantiated yet, but it may become part of the NHTSA investigation. (In its own quarterly “Safety Reports,” which tally accidents with and without Autopilot engaged, Tesla writes, “To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed.” But that is a separate matter from assessing or making statements about accidents on a case-by-case basis. Tesla inexplicably stopped publishing accident data after Q4 2021.)
Full Self-Driving Lawsuit
In September, a number of Tesla owners filed a class action lawsuit against Tesla, claiming the company has yet to produce a fully self-driving car. The law firm representing them says Tesla owners receiving the latest “updates” to Tesla’s Autopilot software and FSD beta software have reported myriad problems, such as cars having difficulty making routine turns, running red lights, and steering into oncoming traffic. There have reportedly also been numerous collisions involving Tesla’s purportedly cutting-edge software, including vehicles crashing at high speeds into large stationary objects such as emergency vehicles and an overturned box truck.
The complaint also alleges that through federal regulatory investigations it has become increasingly clear that Tesla knew all along that its years of statements regarding Autopilot and FSD technology were deceptive and misleading, but it made the statements anyway to hype the company’s vehicles and technology, increase sales, and establish Tesla as a dominant player in the electric vehicle market.
Tesla is contesting the suit, claiming that all Tesla owners consent to resolve legal disputes against the company in binding arbitration rather than in court. No doubt all you Tesla owners out there remember ticking the little box at the time you ordered your car that said you agreed to give up your legal rights in the event there was an issue between you and the company, right?
The judge in the case, however, has yet to rule on Tesla’s request to dismiss it. First-year law students may remember from their Contracts classes that courts can invalidate agreements in which one party holds all the negotiating power, calling them contracts of adhesion because they do not reflect a genuine bargain between two parties with equal opportunity to determine the terms.
The Takeaway

This is the point where Tesla fanboys and Tesla haters usually part company. It seems intuitively obvious to the most casual observer that the term “Full Self-Driving” clearly implies a car equipped with that software is capable of driving itself. What else could it possibly mean?* Tesla likes to say it warns drivers to be alert at all times, but why? If a car has Full Self-Driving, why would driver interventions be necessary? Using the term and arguing it means drivers must always have a hand on the wheel and be ready to take control at a moment’s notice is logically inconsistent. [*Editor’s note: To clarify, when people buy the Full Self-Driving package, they are paying for the hardware that should allow for full self-driving in the future and complimentary software updates that go in that direction and, theoretically, eventually provide full self-driving capability. I think that is clear, but it does appear that the terminology still confuses some people. I just don’t think that includes people who actually buy the package, because there are numerous warnings of all sorts for the driver reminding you that what you paid for is not yet in its final form. —Zach Shahan]
There is another dynamic at work here. What of all the other drivers on the road? Aren’t they entitled to know that the Tesla near them is using technology that purports to make self-driving possible? Don’t those drivers who crashed into the back of that Tesla on the Oakland Bay Bridge have a right to be warned that they are unwittingly participating in some experimental beta testing of a technology that is still under development?
The arrogance of Tesla, which starts at the very top, is stunning. The company gives things fancy names that sound good on Twitter, then complains when people sue it for not living up to the promises it made. It’s a “heads we win, tails you lose” Catch-22 in which Tesla gets to say and do anything it wants with no legal consequences for its actions. It will be interesting to see how the company alters the way it promotes Autopilot and Full Self-Driving as a result of the new California law.

