Tuesday, February 24, 2026

Tesla's Not-A-Robotaxi Service

I have now seen the fabled CyberCab three times in real life. It has two seats, one of them fully equipped with driver controls. In each case a human was using them to drive the car, which is necessary in California because Fake Self-Driving is a Level 2 driver-assistance system that requires a human behind the wheel at all times. A robotaxi that requires a human driver and can carry at most one passenger isn't going to be an economic success.

Fred Lambert has two posts illustrating the distance between Musk's claims and reality. Below the fold I look at both of them:

"Safety monitors" less safe than "drivers"

First, Tesla ‘Robotaxi’ adds 5 more crashes in Austin in a month — 4x worse than humans:
Tesla has reported five new crashes involving its “Robotaxi” fleet in Austin, Texas, bringing the total to 14 incidents since the service launched in June 2025. The newly filed NHTSA data also reveals that Tesla quietly upgraded one earlier crash to include a hospitalization injury, something the company never disclosed publicly.
Even before they were changed, we knew very few of the details:
As with every previous Tesla crash in the database, all five new incident narratives are fully redacted as “confidential business information.” Tesla remains the only ADS operator to systematically hide crash details from the public through NHTSA’s confidentiality provisions. Waymo, Zoox, and every other company in the database provide full narrative descriptions of their incidents.
But what we do know isn't good:
With 14 crashes now on the books, Tesla’s “Robotaxi” crash rate in Austin continues to deteriorate. Extrapolating from Tesla’s Q4 2025 earnings mileage data, which showed roughly 700,000 cumulative paid miles through November, the fleet likely reached around 800,000 miles by mid-January 2026. That works out to one crash every 57,000 miles.
The numbers aren't just not good, they're appalling:
By the company’s own numbers, its “Robotaxi” fleet crashes nearly 4 times more often than a normal driver, and every single one of those miles had a safety monitor who could hit the kill switch. That is not a rounding error or an early-program hiccup. It is a fundamental performance gap.
There are two points that need to be made about how bad this is:
  • However badly it is doing so, Tesla is trying to operate a taxi service. So it is misleading to compare its crash rate with that of "normal drivers". The correct comparison is with taxi drivers. The New York Times reported that:
    In a city where almost everyone has a story about zigzagging through traffic in a hair-raising, white-knuckled cab ride, a new traffic safety study may come as a surprise: It finds that taxis are pretty safe.

    So are livery cars, according to the study, which is based on state motor vehicle records of accidents and injuries across the city. It concludes that taxi and livery-cab drivers have crash rates one-third lower than drivers of other vehicles.
    A law firm has a persuasive list of reasons why this is so. So Tesla's "robotaxi" actually crashes about six times more often than a taxi.
  • Fake Self-Driving is a Level 2 system that requires a human behind the wheel, and that is how Tesla's service in California has to operate. But in Austin the human sits in the passenger seat, or in a chase car. Tesla has been placing bystanders at risk by deliberately operating in a way that it knows is unsafe, as the statistics it reports show.

Tesla's Catch-22

Second, Tesla admits it still needs drivers and remote operators — then argues that’s better than Waymo:
Tesla filed new comments with the California Public Utilities Commission that amount to a quiet admission: its “Robotaxi” service still relies on both in-car human drivers and domestic remote operators to function. Rather than downplaying these dependencies, Tesla leans into them — arguing that its multi-layered human supervision model is more reliable than Waymo’s fully driverless system, pointing to the December 2025 San Francisco blackout as proof.

The filing, submitted February 13 in CPUC Rulemaking 25-08-013, reveals the massive operational gap between what Tesla calls a “Robotaxi” and what Waymo actually operates as one.
Tesla's filing admits that the service they market as a "robotaxi" really isn't one:
Tesla operates its service using TCP (Transportation Charter Party) vehicles equipped with FSD (Supervised), a Level 2 ADAS system that, by definition, requires a licensed human driver behind the wheel at all times, actively monitoring and ready to intervene.

On top of that in-car driver, Tesla describes a parallel layer of remote operators. The company states it employs domestically located remote operators in both Austin and the Bay Area, and that these operators are subject to DMV-mandated U.S. driver’s licenses, “extensive background checks and drug and alcohol testing,” and mandatory training. Tesla frames this as a redundancy system, remote operators in two cities backing up the in-car drivers.

That’s two layers of human supervision for a service Tesla markets as a “Robotaxi.”
Compare this with a Waymo:
Waymo’s vehicles have no driver in the car. Waymo uses remote assistance operators who can provide guidance to vehicles in ambiguous situations, but the vehicle drives itself. Waymo’s remote operators don’t control the car, they confirm whether it’s safe to proceed in edge cases like construction zones or unusual road conditions.

... Tesla’s system requires a human to drive the car and has remote operators as backup. Waymo’s system drives itself and has remote operators as backup. Tesla is essentially describing a staffing-intensive taxi service with driver-assist software. Waymo is describing an autonomous transportation network.
This is where Tesla's marketing of its service as a "robotaxi" creates a Catch-22:
Tesla argues forcefully that its Level 2 ADAS vehicles should remain outside the scope of this AV rulemaking entirely, agreeing with Lyft that they aren’t “autonomous vehicles” under California law.

At the same time, Tesla is fighting Waymo’s proposal to prohibit Level 2 services from using terms like “driverless,” “self-driving,” or “robotaxi.” Tesla calls this proposal “wholly unnecessary,” arguing that existing California advertising laws already cover misleading marketing.
But note that:
A California judge already ruled in December 2025 that Tesla’s marketing of “Autopilot” and “Full Self-Driving” violated the state’s false advertising laws.
So here is the Catch-22:
Tesla is telling regulators its vehicles are not autonomous and require human drivers, while simultaneously fighting for the right to keep calling the service a “Robotaxi.” Tesla wants the legal protections of being classified as a supervised Level 2 system and the marketing benefits of sounding like a fully autonomous one.
Sadly, this is just par for the course for Tesla's marketing. Essentially everything Elon Musk has said about not just the schedule but, more importantly, the capabilities of Fake Self-Driving has been a lie, for example the faked 2016 demonstration video. These lies have killed many credulous idiots, but they have succeeded in pumping TSLA to a ludicrous P/E ratio because of the kind of irresponsible journalism Karl Bode describes in The Media Can't Stop Propping Up Elon Musk's Phony Supergenius Engineer Mythology:
One of my favorite trends in modern U.S. infotainment media is something I affectionately call "CEO said a thing!" journalism.

"CEO said a thing!" journalism generally involves a press outlet parroting the claims of a CEO or billionaire utterly mindlessly without any sort of useful historical context as to whether anything being said is factually correct.

There's a few rules for this brand of journalism. One, you can't include any useful context that might shed helpful light on whether what the executive is saying is true. Two, it's important to make sure you never include a quote from an objective academic or expert in the field you're covering that might challenge the CEO.
After all, if a journalist does include an expert pointing out that the CEO is bullshitting:
statements produced without particular concern for truth, clarity, or meaning
the journalist will lose the access upon which his job depends. But I'm not that journalist, so here is my list of the past and impending failures of the "Supergenius Engineer":
Contrast these with the successes:
  • Tesla's cars: Wikipedia notes that:
    Tesla was incorporated in July 2003 by Martin Eberhard and Marc Tarpenning as Tesla Motors. ... In February 2004, Elon Musk led Tesla's first funding round and became the company's chairman, subsequently claiming to be a co-founder
    Starting in 2008, Franz von Holzhausen designed the Model S, which launched in 2012 and was Tesla's first success. Initially Tesla was a great success, but it has failed to update its line-up. It is now far behind Chinese EV manufacturers and is losing market share worldwide. It will lose US market share too once the Chinese set up US factories.
  • SpaceX Falcon 9: Musk's insight that re-usability would transform the space business was a huge success, but it was realized thanks to significant government support and a great CEO, Gwynne Shotwell.
This history seems like valuable context for journalists to include in reports of Musk's next pronouncement.
