The previous post in this series,
Elon Musk: Threat or Menace Part 3, was based on the impressively detailed reporting from a team at the
Washington Post on the crash that killed Jeremy Banner in
The final 11 seconds of a fatal Tesla Autopilot crash. The team's subsequent, equally detailed
Tesla worker killed in fiery crash may be first ‘Full Self-Driving’ fatality triggered
this comment, which concluded:
It seems the driver thought that it was OK to drive home with a blood alcohol level of 0.26 because he believed Musk's hype that Fake Self Driving would handle it despite having to repeatedly override it on the way out.
Now, the team's Faiz Siddiqui and Trisha Thadani are out with
In 2018 crash, Tesla’s Autopilot just followed the lane lines. Below the fold I look into what it reveals about Autopilot.
The article is based upon depositions in a
trial about to start:
The case involves a fatal crash in March 2018, when a Tesla in Autopilot careened into a highway barrier near Mountain View, Calif., after getting confused by what the company’s lawyers described in court documents as a “faded and nearly obliterated” lane line.
The driver, Walter Huang, 38, was killed. An investigation by the National Transportation Safety Board later cited Tesla’s failure to limit the use of Autopilot in such conditions as a contributing factor: The company has acknowledged to the National Transportation Safety Board that Autopilot is designed for areas with “clear lane markings.”
Musk's and Tesla's marketing hype conflicts with the
deposition:
Under oath, however, Tesla engineer Akshay Phatak last year described the software as fairly basic in at least one respect: the way it steers on its own.
“If there are clearly marked lane lines, the system will follow the lane lines,” Phatak said under questioning in July 2023. Tesla’s groundbreaking system, he said, was simply “designed” to follow painted lane lines.
...
In his deposition, Phatak said Autopilot will work wherever the car’s cameras detect lines on the road: “As long as there are painted lane lines, the system will follow them,” he said.
In this case,
it did:
Huang, an engineer at Apple, bought his Tesla Model X in fall 2017 and drove it regularly to work along U.S. Highway 101, a crowded multilane freeway that connects San Francisco to the tech hubs of Silicon Valley. On the day of the crash, his car began to drift as a lane line faded. It then picked up a clearer line to the left — putting the car between lanes and on a direct trajectory for a safety barrier separating the highway from an exit onto State Route 85.
Huang’s car hit the barrier at 71 mph, pulverizing its front end, twisting it into an unrecognizable heap. Huang was pronounced dead hours later, according to court documents.
...
In the months preceding the crash, Huang’s vehicle swerved in a similar location eleven times, according to internal Tesla data discussed by Huang’s lawyers during a court hearing last month. According to the data, the car corrected itself seven times. Four other times, it required Huang’s intervention. Huang was allegedly playing a game on his phone when the crash occurred.
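To make the failure mode concrete, here is a minimal sketch of a "just follow the painted lines" policy. It is purely illustrative (the LaneLine structure, confidence scores, and threshold are my assumptions, not Tesla's code), but it shows how a controller that locks onto the most visible line, with no check that the line actually bounds the current lane, re-centers on the wrong line the moment the correct one fades:

```python
# Illustrative sketch only, NOT Tesla's code. The LaneLine structure,
# confidence scores, and threshold are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class LaneLine:
    offset_m: float    # lateral offset of the line from the car's centerline
    confidence: float  # 0.0 (invisible paint) .. 1.0 (crisp paint)

def steering_target(detected_lines: list[LaneLine],
                    min_confidence: float = 0.3) -> float | None:
    """Naive policy: track the most visible painted line.

    There is no check that the chosen line bounds the *current* lane,
    so when the right-hand line fades below threshold the controller
    re-centers on the clearer line to the left, putting the car
    between lanes, on exactly the trajectory described above.
    """
    visible = [l for l in detected_lines if l.confidence >= min_confidence]
    if not visible:
        return None  # note: losing the lines need not force a disengagement
    best = max(visible, key=lambda l: l.confidence)
    return best.offset_m  # steer toward the most visible paint

# The "faded and nearly obliterated" right line drops out; the clear
# line to the left wins.
lines = [LaneLine(offset_m=+1.8, confidence=0.1),
         LaneLine(offset_m=-1.8, confidence=0.9)]
print(steering_target(lines))  # -> -1.8: drifting left, toward the barrier
```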
It has been evident for a long time that just following the lines doesn't
live up to the hype:
For years, Tesla and federal regulators have been aware of problems with Autopilot following lane lines, including cars being guided in the wrong direction of travel and placed in the path of cross-traffic — with sometimes fatal results. Unlike vehicles that are designed to be completely autonomous, like cars from Waymo or Cruise, Teslas do not currently use sensors such as radar or lidar to detect obstacles. Instead, Teslas rely on cameras.
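The redundancy point is simple enough to state in code. A hedged sketch, where the sensor list and the two-of-three voting rule are illustrative assumptions rather than any manufacturer's actual fusion algorithm:

```python
# Hedged sketch of sensor redundancy; the voting rule is an assumption.
def obstacle_confirmed(camera: bool, radar: bool, lidar: bool) -> bool:
    # Cross-check independent modalities: a camera fooled by glare or
    # faded paint can be outvoted by radar and lidar returns.
    return sum([camera, radar, lidar]) >= 2

def obstacle_confirmed_camera_only(camera: bool) -> bool:
    # A camera-only stack has no second opinion: whatever the vision
    # system believes is what the planner acts on.
    return camera
```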
As usual, Tesla's response to the crash was to
do as little as possible:
After the crash that killed Huang, Tesla told officials that it updated its software to better recognize “poor and faded” lane markings and to audibly alert drivers when vehicles might lose track of a fading lane. The updates stopped short of forcing the feature to disengage on its own in those situations, however. About two years after Huang died, federal investigators said they could not determine whether those updates would have been sufficient to “accurately and consistently detect unusual or worn lane markings” and therefore prevent Huang’s crash.
The most important thing for Tesla is never to remind drivers of the limitations of its software, because doing so would exacerbate the fall in the stock price, currently down 57% from its peak. As I wrote in
Autonomous Vehicles: Trough of Disillusionment:
Elon Musk famously claimed that Tesla is worth zero without Full Self Driving. But although this is typical Musk BS, ... unlike some other utterances it contains a kernel of truth. Tesla is valued as a technology company, not a car company. Thus it is critical for Tesla that its technology be viewed as better than those of other car companies; anything that suggests it is limited or inadequate is a big problem, not just for the company but also for Musk's personal wealth.
Liam Denning describes the problem for Musk if
doubts emerge about the AIs driving Teslas:
Tesla is, overwhelmingly, a maker of electric vehicles, combining high growth with high margins — until recently anyway. Deliveries increased by 38% in 2023 — below the company’s long-term target of 50% per year — and the consensus for 2024 implies just 21%. Trailing 12-month net profit as of the third quarter was actually down, year over year.
Yet in the most starry-eyed Wall Street financial models, the making and selling of vehicles — generating 92% of Tesla’s current gross profit — accounts for only a fraction of Tesla’s purported valuation. The rest relates to whatever Tesla’s next big thing might turn out to be, usually something related to artificial intelligence, be it robotaxis, licensed self-driving systems, the Optimus humanoid robot or just something else that might spring from the company’s Dojo supercomputing project.
Amorphous as the narrative may be, remove it and the tenuous tether between Tesla’s valuation and something approximating a potential future reality evaporates entirely.
In
The Biggest AI Hype Fraud of All Time, Michael Spencer writes:
Tesla's FSD costs have tripled since 2019, costing more than $15,000 in the United States. This pumped up, fraudulently, Tesla’s margins on selling vehicles; however, Elon Musk’s promises did not come to fruition after many deadlines passed.
Spencer notes that "
desperation at Tesla is very noticeable in 2024":
In a push for end-of-quarter sales, Musk recently mandated that all sales and service staff install and demo FSD for customers before handing over the keys.
...
In a recent April 5th Tweet on X, Elon Musk says full level 5 FSD is coming in August, 2024. Tesla’s stock so far in 2024 is down 33%.
He focuses on Musk's
pivot to x.AI:
The myth that Tesla is a technology or AI company has been very crucial in the false promise marketing around the brand. Elon Musk’s weird response to this failure in 2024 is to poach AI talent from his Tesla to his own x.AI company.
This is because x.AI plans to do a huge $3 Billion funding round that would value the AI startup at $18 Billion. This is all more or less breaking news.
The problem is AI frauds have a habit of big declines. Elon Musk may have to make his SpaceX company, valued at around $180 billion as of early 2024, go public with an IPO to raise the funds needed to support his X Corp empire.
Maintaining the illusion of superior technology requires
leaps of logic:
Since 2017, officials with NTSB have urged Tesla to limit Autopilot use to highways without cross traffic, the areas for which the company’s user manuals specify Autopilot is intended. Asked by an attorney for Huang’s family if Tesla “has decided it’s not going to do anything” on that recommendation, Phatak argued that Tesla was already following the NTSB’s guidance by limiting Autopilot use to roads that have lane lines.
Note how, in Tesla's world, any "roads that have lane lines" are "highways without cross traffic", and that Tesla is
not limiting Autopilot's use but
asking its customers to limit its use. A significant difference.
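The gap between the two readings is easy to make concrete. In this hypothetical sketch (the road attributes and names are my assumptions, not anyone's actual data structures) a city street with painted lines passes Tesla's test and fails the NTSB's:

```python
# Hypothetical sketch of the two readings of the NTSB recommendation.
from dataclasses import dataclass

@dataclass
class Road:
    has_lane_lines: bool
    is_limited_access: bool  # divided highway, no cross traffic

def tesla_reading(road: Road) -> bool:
    # Phatak's interpretation: any road with painted lines qualifies.
    return road.has_lane_lines

def ntsb_reading(road: Road) -> bool:
    # What the NTSB urged: highways without cross traffic only.
    return road.has_lane_lines and road.is_limited_access

city_street = Road(has_lane_lines=True, is_limited_access=False)
print(tesla_reading(city_street))  # True  -- Autopilot engages
print(ntsb_reading(city_street))   # False -- Autopilot would refuse
```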
And Musk's
reality distortion field is in
full effect:
When asked whether Autopilot would use GPS or other mapping systems to ensure a road was suitable for the technology, Phatak said it would not. “It’s not map based,” he said — an answer that diverged from Musk’s statement in a 2016 conference call with reporters that Tesla could turn to GPS as a backup “when the road markings may disappear.” In an audio recording of the call cited by Huang family attorneys, Musk said the cars could rely on satellite navigation “for a few seconds” while searching for lane lines.
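For what it's worth, the fallback Musk described would be a modest amount of logic. Here is a hedged sketch of "rely on satellite navigation for a few seconds while searching for lane lines"; the timeout, method names, and disengagement behavior are all invented, and per Phatak's testimony the shipping system does none of this:

```python
import time

class LateralGuidance:
    """Hedged sketch of the fallback Musk described in 2016; everything
    here is invented for illustration."""
    GPS_FALLBACK_SECONDS = 3.0  # "for a few seconds"

    def __init__(self) -> None:
        self.lines_lost_at: float | None = None

    def mode(self, lines_visible: bool) -> str:
        if lines_visible:
            self.lines_lost_at = None
            return "follow-lane-lines"
        if self.lines_lost_at is None:
            self.lines_lost_at = time.monotonic()
        if time.monotonic() - self.lines_lost_at < self.GPS_FALLBACK_SECONDS:
            return "follow-gps-map"   # bridge the gap from satellite nav
        return "alert-and-disengage"  # hand control back to the driver
```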
This casual attitude to operating in the real world is
typical of Tesla:
Phatak’s testimony also shed light on other driver-assist design choices, such as Tesla’s decision to monitor driver attention through sensors that gauge pressure on the steering wheel. Asked repeatedly by the Huang family’s lawyer what tests or studies Tesla performed to ensure the effectiveness of this method, Phatak said it simply tested it with employees.
Given Musk's notorious hair-trigger
firings in response to disagreement, testing with employees is pretty much guaranteed to discover that the system performs almost perfectly.
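Torque on the steering wheel is a notoriously weak proxy for attention. A hedged sketch of how such a monitor typically works (thresholds, timings, and names are my assumptions, not Tesla's published values) shows why: any light pressure on the rim passes, whether the driver is watching the road or playing a game on a phone:

```python
# Hedged sketch of torque-based driver monitoring; thresholds and
# timings are assumptions, not Tesla's published values.
def hands_detected(torque_nm: float, threshold_nm: float = 0.3) -> bool:
    # The only signal is steering-wheel torque: light pressure on the
    # rim passes, regardless of where the driver is actually looking.
    return abs(torque_nm) >= threshold_nm

def monitor(torque_samples_nm: list[float],
            sample_period_s: float = 1.0,
            nag_after_s: float = 30.0) -> list[str]:
    """Escalate to a visual/audible nag after a hands-off interval."""
    events: list[str] = []
    hands_off_s = 0.0
    for torque in torque_samples_nm:
        if hands_detected(torque):
            hands_off_s = 0.0
        else:
            hands_off_s += sample_period_s
            if hands_off_s >= nag_after_s:
                events.append("nag-driver")
                hands_off_s = 0.0  # reset after each nag
    return events
```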
The
Washington Post team points out that this poor engineering of life-critical systems has
real-world impacts:
Tesla’s heavy reliance on lane lines reflects the broader lack of redundancy within its systems when compared to rivals. The Post has previously reported that Tesla’s decision to omit radar from newer models, at Musk’s behest, culminated in an uptick in crashes.
Whereas other companies
behave responsibly:
Other Tesla design decisions have differed from competitors pursuing autonomous vehicles. For one thing, Tesla sells its systems to consumers, while other companies tend to deploy their own fleets as taxis. It also employs a unique, camera-based system and places fewer limits on where the software can be engaged. For example, a spokesperson for Waymo, the Alphabet-owned self-driving car company, said its vehicles operate only in areas that have been rigorously mapped and where the cars have been tested in conditions including fog and rain, a process known as “geo-fencing.”
“We’ve designed our system knowing that lanes and their markings can change, be temporarily occluded, move, and sometimes, disappear completely,” Waymo spokeswoman Katherine Barna said.
So that's all there is to Autopilot. No radar, no lidar, no GPS, no map, no geofencing, no proper driver monitoring. It just uses the camera to follow the lines. It doesn't disengage if it can't see the lines; it just keeps going. So much for Tesla's vaunted AI capabilities! I wonder how much more you get for the $15K extra you pay for Fake Self Driving?
Tesla's stock is down almost 50% from its all-time high, and the news just keeps getting worse:
1) Julia Marnin reports that Toddler was able to start ‘defective’ Tesla and crashed into pregnant mom, lawsuit says:
"A woman bought a new Tesla Model X in 2018, believing it was the “safest SUV on the market” as the company had advertised, while pregnant with her second child, according to a lawsuit. The vehicle, however, proved “defective” when Mallory Harcourt’s 2-year-old son was able to start the car, causing it to jolt forward and hit her at her home in Santa Barbara, California, in December 2018, two days after Christmas, an April 8 trial brief filed by her attorneys says."
2) Jonathan M. Gitlin reports that Cybertruck owners allege pedal problem as Tesla suspends deliveries:
"Yesterday, a Cybertruck owner on TikTok posted a video showing how the metal cover of his accelerator pedal allegedly worked itself partially loose and became jammed underneath part of the dash. The driver was able to stop the car with the brakes and put it in park. At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem."
Unintended acceleration problems have plagued Tesla for years. In this case, "This cover became partially detached from the accelerator pedal. And then became stuck underneath some trim, jamming the accelerator on full."
3) Jonathan M. Gitlin also reports that Tesla to lay off more than 10 percent of its workers as sales slow:
"Times are starting to get tough for Tesla. The electric vehicle automaker had been riding high, with quarter after quarter of successive growth and plenty of profits in the process. But lately, that success has mostly been due to a series of price cuts meant to tempt customers to buy into an aging lineup. This March, the company reported its first quarterly decline since 2020.
Now, it plans to lay off more than 10 percent of its workforce, according to an internal memo seen by Reuters."
4) Edward Ludlow and Dana Hull report that Tesla Executive Baglino Leaves as Musk Loses Another Top Deputy:
"Two of Tesla Inc.’s top executives have left in the midst of the carmaker’s largest-ever round of job cuts, as slowing electric-vehicle demand leads the company to reduce its global headcount by more than 10%.
The cuts could reach closer to 20% in some divisions, two people familiar with the matter say."
Bryce Elder's Tesla’s Q1 is going to be a wreck. Will anyone care? adds to the bad news:
ReplyDelete"Musk’s pivot to moonshots since then has been dramatic, so Tesla’s first-quarter results due April 23 are probably going to be awful. When faced with a 20 per cent quarter-on-quarter slump in deliveries there are not enough costs to cut, and while consensus forecasts have been grinding lower, they’re probably not low enough."
The moonshot is Fake Self-Driving for robotaxis. Musk tweeted:
"Not quite betting the company, but going balls to the wall for autonomy is a blindingly obvious move.
Everything else is like variations on a horse carriage."
Today's stories of Tesla's state-of-the-art engineering. First, from Jonathan M. Gitlin comes Tesla recalls all 3,878 Cybertrucks over faulty accelerator pedal cover:
ReplyDelete"The problem, which affects all 3,878 Cybertrucks delivered so far, has to do with the EV's accelerator pedal. Tesla has fitted this with a metal-finish cover to match the brushed metal appearance of the truck itself—no word on whether the pedals rust, too—but it says that at some point, "an unapproved change introduced lubricant (soap) to aid in the component assembly of the pad onto the accelerator pedal. Residual lubricant reduced the retention of the pad to the pedal."
Thanks to the profile of the Cybertruck's under dash, if the pedal cover becomes partially detached it can slide up and become trapped in place, wedging the pedal down and unleashing all of the Cybertruck's substantial power"
Second, from Bradley Brownell comes Tesla Cybertruck No Match For Car Wash:
"After just a couple of months and a few thousand miles of ownership, Tik Tok user @captian.ad’s Tesla Cybertruck was effectively a several-thousand-pound paperweight for several hours. After taking his truck to the beach and stopping off at a car wash to clean it up, he parked the truck in his garage, where it decided to just stop working for a while. The screen, which runs all functions of the truck, went black, and wouldn’t respond at all, even after performing the factory prescribed reboot procedure. Not great."
"At the beginning of the month, another Cybertruck owner claimed to have crashed into a light pole due to an unintended acceleration problem.""
As I posted on Ars, this is a misstatement by Gitlin. And as I thought would happen, the misstatement is now being spread around by others without additional fact-checking (I sincerely don't blame you, David, for not doing any additional fact-checking of a 24-page forum post; this is on Gitlin).
1) https://arstechnica.com/civis/threads/tesla-recalls-all-3-878-cybertrucks-over-faulty-accelerator-pedal-cover.1500173/post-42764843
2) https://arstechnica.com/civis/threads/tesla-recalls-all-3-878-cybertrucks-over-faulty-accelerator-pedal-cover.1500173/post-42765163
Actually, that first quote from Gitlin isn't too bad (though whether the issue is with acceleration or braking is another question, as is whether the issue is a design defect or expected and preferred behavior of the antilock braking system in those particular conditions). My original complaint was with Gitlin's statement in his second article that you mention in your third comment.
Richard Currie has a lot more detail on the Cybertruck car wash problem in Tesla Cybertruck turns into world's most expensive brick after car wash:
ReplyDelete"Our hero, probably feeling the same sort of panic one experiences when a computer dies mid-use, followed the Tesla guidelines for a reboot only to be met by a disconcerting and electrical "pop." Following this reset, he said that the Tesla "T" is supposed to come up on the screen and all should be well. That did not happen.
...
He also received a call from Tesla to check on him. The advisor said that "it is a known issue in the Cybertruck that when you do a screen reset, instead of resetting in the standard two minutes, it takes five hours."
So when he had initiated that hard reset before, it had taken half the night to go through."
Note that:
"an X user plumbed the depths of the Cybertruck owner's manual, which routinely turns up comedy gold.
In this instance, attention was drawn to a line saying: "CAUTION Failure to put Cybertruck in Car Wash Mode may result in damage (for example, to the charge port or windshield wipers). Damage caused by car washes is not covered by the warranty."
But regular washing is essential:
"To prevent damage to the exterior, immediately remove corrosive substances (such as grease, oil, bird droppings, tree resin, dead insects, tar spots, road salt, industrial fallout, etc.)," it says. "Do not wait until Cybertruck is due for a complete wash. If necessary use denatured alcohol to remove tar spots and stubborn grease stains, then immediately wash the area with water and a mild, non-detergent soap to remove the alcohol."
Thank heavens the exterior is "stainless"!
Edward Ludlow and Dana Hull's Elon Musk’s Robotaxi Dreams Plunge Tesla Into Chaos reinforces the sense of desperation at Tesla:
ReplyDelete"On Tuesday, Tesla is expected to report a 40% plunge in operating profit and its first revenue decline in four years. Musk has ordered up the company’s biggest layoffs ever and staked its future on a next-generation, self-driving vehicle concept called the robotaxi. People familiar with his directives, who asked not to be identified discussing internal deliberations, are unsettled by the changes the CEO wants to push through.
...
Musk and top engineers are particularly bullish about a major change in how FSD now works. Cameras placed around the company’s cars are taking in video and using this footage to dictate how the vehicle drives, instead of relying on software code."
Wow! I wonder what inputs the "software code" was using before this?
More than two years ago, Invictus wrote:
ReplyDelete"Watching Musk’s ongoing antics, it has become very clear to me that his actions are not only destroying Twitter, but the collateral damage is taking Tesla down with it – a veritable twofer of ineptitude and rake-stepping."
Now, in Alienating Tesla Buyers by the Cybertruck-load he takes a victory lap:
"Now comes the Wall St. Journal with the headline: Elon Musk Lost Democrats on Tesla When He Needed Them Most
The proportion of Democrats buying Tesla vehicles fell by more than 60%, according to car buyers surveyed in October and November by researcher Strategic Vision.
I’m not one who’s typically inclined to take a victory lap, but this one feels like it was so obvious to see coming by bringing together data from very disparate sources and nailed the eventual outcome."
Musk's hype about Autopilot claims another innocent victim in Catalina Gaitán's Tesla driver was using Autopilot before fatal Monroe crash, police say:
ReplyDelete"A 56-year-old Snohomish man had set his Tesla Model S on Autopilot and was looking at his cellphone on Friday when he struck and killed a motorcyclist in front of him in Monroe, court records show.
...
The motorcyclist, Jeffrey Nissen, 28, of Stanwood, died at the scene, records show.
The Tesla driver told a state trooper he was driving home from having lunch in Bothell and was looking at his phone when he heard a bang and felt his car lurch forward, accelerate and hit the motorcyclist, according to the affidavit.
The man told the trooper his Tesla got stuck on top of the motorcyclist and couldn’t be moved in time to save him, the affidavit states."
Matthew Connatser reports that SpaceX workplace injury rates are rocketing:
ReplyDelete"Workplace safety data reported to the US government for 2023 indicates that SpaceX's injury rate continues to surpass the industry average.
The Occupational Safety and Health Administration (OSHA) disclosed the numbers, which were picked up by Reuters. The biz's injury rate in 2023 varied by facility, ranging from 1.5 injuries per 100 workers at SpaceX's Redmond site up to 7.6 for West Coast rocket recovery.
For reference, the average for the space industry was 0.8 injuries per 100 workers, meaning the rate in SpaceX's West Coast rocket recovery arm was almost 10 times worse than the average."
It isn't just Tesla's Autopilot that has a habit of running into stationary vehicles, as Brandon Vigliarolo reports in Ford's BlueCruise driving assistant probed by US watchdog after deaths:
ReplyDelete"The carmaker's driver-assistance technology BlueCruise has been named as the subject of an NHTSA investigation involving a pair of fatal crashes in Texas and Pennsylvania. In both cases, Ford Mustang Mach-E electric vehicles struck stationary vehicles, similar to the crashes that triggered a multi-year NHTSA investigation into Tesla's Autopilot.
"Initial investigation of both incidents confirmed that BlueCruise was engaged in each of the subject vehicles immediately prior to the collision," the NHTSA said in its preliminary investigation notice. That being the case, the investigation will focus on BlueCruise-equipped Mustang Mach-Es manufactured since 2021."