The reason for yet another post in the series is that Trisha Thadani, Rachel Lerman, Imogen Piper, Faiz Siddiqui and Irfan Uraizee of the Washington Post have published an extraordinarily detailed forensic analysis of the first widely-publicized fatal Autopilot crash in "The final 11 seconds of a fatal Tesla Autopilot crash". This was the crash in which Autopilot failed to see a semi-trailer stopped across its path and decapitated the driver.
Below the fold I comment on the details their analysis reveals.
The TL;DR of the Post's analysis and my questions is:
- The driver set Autopilot's speed to 69 mph in a 55 mph zone. Modern cars know what the speed limit is, so why did Tesla allow this?
- The driver enabled Autopilot on a road with cross traffic despite Autopilot not being allowed on roads with cross traffic. Modern cars have GPS, so they know what road they are on, so why did Tesla allow this?
- The driver took his hands off the wheel, and was clearly not paying attention. Tesla's driver monitoring system is notoriously inadequate. It would only have warned the driver to pay attention half a mile down the road. Why would Tesla think that the driver not paying attention for half a mile was OK?
- Autopilot didn't brake - if it had braked only 160' before impact the crash would not have happened. Apparently the fact that its cameras could not reliably detect cross traffic was well-known to Tesla's engineers, which is why Autopilot was not supposed to be enabled on roads with cross traffic. Why would Tesla think a system unable to detect cross traffic was safe for use on public roads?
To reconstruct the crash, The Post relied on hundreds of court documents, dash cam photos and a video of the crash taken from a nearby farm, as well as satellite imagery, NTSB crash assessment documents and diagrams, and Tesla’s internal data log, which the NTSB included in its investigation report.

The driver, Jeremy Banner, started the accident by violating the law:
At 6:16 a.m., Banner sets cruise control to a maximum of 69 mph, though the speed limit on U.S. 441 is 55. He turns on Autopilot 2.4 seconds later.

It is typical of Tesla's disdain for the law that, although their cars have GPS and can therefore know the speed limit, they didn't bother to program Autopilot to obey the law.
Banner didn't just violate the law, he also violated Tesla's user documentation:
According to Tesla’s user documentation, Autopilot wasn’t designed to work on a highway with cross-traffic such as U.S. 441. But drivers sometimes can activate it in areas and under conditions for which it is not designed.

Again, Tesla's disdain for the safety of their customers, not to mention other road users, meant that, despite the car knowing which road it was on and thus that Autopilot should not be activated there, it let Banner enable it.
Banner immediately violated the user documentation again:
Two seconds later, the Tesla’s data log registers no “driver-applied wheel torque,” meaning Banner’s hands cannot be detected on the wheel.

Safety should have required an immediate warning and for Autopilot to start braking the car. But no:
If Autopilot does not detect a driver’s hands, it flashes a warning. In this case, given Banner’s speed, the warning would have come after about 25 seconds, according to the NTSB investigation.

25 seconds at 69 mph is 2,530 feet, or 0.48 miles. Although Tesla admits that Autopilot is a Level 2 driver assistance system which requires the driver's full attention at all times, they think it is OK for the car to drive itself for half a mile at 69 mph before the driver needs to start paying attention. Of course, there would be an additional delay after the warning as the driver recovered situational awareness.
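For those who want to check the arithmetic, here is a minimal sketch of the unit conversion. Nothing in it is Tesla-specific; the speed and warning delay are simply the figures reported above:

```python
# Distance a car covers during a 25-second hands-off warning delay at 69 mph.
MPH_TO_FPS = 5280 / 3600        # feet per second per mile per hour

speed_mph = 69                  # the cruise speed Banner set
warning_delay_s = 25            # hands-off warning delay, per the NTSB

distance_ft = speed_mph * MPH_TO_FPS * warning_delay_s
print(f"{distance_ft:,.0f} feet, {distance_ft / 5280:.2f} miles")
# -> about 2,530 feet, 0.48 miles
```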
Banner does not have that long.
Combined with Tesla's pathetic "wheel torque" system for detecting whether the driver is paying attention, a 25-second delay is obviously why their cars keep running into first responders, highway dividers and other obstacles up to half a mile away from where the driver zoned out.
But from Tesla's point of view, actually enforcing the rule that the driver have their hands on the wheel and be paying attention would involve "nagging" the driver. And this would make it clear even to the fan-bois that the technology was nothing like the fantasy of Tesla's marketing. And that the idea that Teslas being used as robo-taxis would more than double Tesla's market cap was a sick joke.
Two seconds later — just before impact — the Tesla’s forward-facing camera captures this image of the truck.

Banner is not paying attention, and is now doomed:
The car does not warn Banner of the obstacle. “According to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car,” the NTSB crash report says.
The Tesla continues barreling toward the tractor-trailer at nearly 69 mph. Neither Banner nor Autopilot activates the brakes.
The Tesla continues on for another 40 seconds, traveling about 1,680 feet — nearly a third of a mile — before finally coasting to a stop on a grassy median.

Late braking could have saved Banner:
Braking even 1.6 seconds before the crash could have avoided the collision, The Post’s reconstruction found by reviewing braking distance measurements of a 2019 Tesla Model 3 with similar specifications, conducted by vehicle testers at Car and Driver. At this point the truck was well within view and spanning both lanes of southbound traffic.

The NTSB concluded:
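The 1.6-second figure is also where the "160 feet" in the TL;DR above comes from, and a little kinematics shows why it is plausible. The sketch below assumes a full-braking deceleration of roughly 1 g; that is my approximation of what Car and Driver-style 70-0 mph tests imply for a modern sedan, not a figure from the Post:

```python
# Could braking that started 1.6 s before the impact point have stopped the car?
MPH_TO_FPS = 5280 / 3600
G_FPS2 = 32.2                    # one g, in ft/s^2

speed_fps = 69 * MPH_TO_FPS      # ~101 ft/s
lead_time_s = 1.6                # braking begins this long before the impact point
decel_fps2 = 1.0 * G_FPS2        # assumed hard-braking deceleration (~1 g)

distance_available_ft = speed_fps * lead_time_s              # ~162 ft to the truck
stopping_distance_ft = speed_fps ** 2 / (2 * decel_fps2)     # ~159 ft to stop

print(f"distance to the truck: {distance_available_ft:.0f} ft")
print(f"distance needed to stop: {stopping_distance_ft:.0f} ft")
# With ~1 g of braking the car stops just short; with no braking it arrives at ~69 mph.
```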
The NTSB investigation determined that Banner’s inattention and the truck driver’s failure to fully yield to oncoming traffic were probable causes of the crash. However, the NTSB also cited Banner’s “overreliance on automation,” saying Tesla’s design “permitted disengagement by the driver” and contributed to the crash.

Where did Banner's “overreliance on automation” come from? The Post team have a suggestion:
Banner researched Tesla for years before buying a Model 3 in 2018, his wife, Kim, told federal investigators. Around the time of his purchase, Tesla’s website featured a video showing a Tesla navigating the curvy roads and intersections of California while a driver sits in the front seat, hands hovering beneath the wheel.

This video is a notorious fake:
The video, recorded in 2016, is still on the site today.
“The person in the driver’s seat is only there for legal reasons,” the video says. “He is not doing anything. The car is driving itself.”
A Tesla engineer testified that a team specifically mapped the route the car would take in the video. At one point during testing for the video, a test car crashed into a fence, according to Reuters. The engineer said in a deposition that the video was meant to show what the technology could eventually be capable of — not what cars on the road could do at the time.

There is a massive disconnect between what Musk and Tesla's marketing say about their driver assistance technologies and what they tell regulators and juries:
While the video concerned Full Self-Driving, which operates on surface streets, the plaintiffs in the Banner case argue Tesla’s “marketing does not always distinguish between these systems.”
In a Riverside, Calif., courtroom last month in a lawsuit involving another fatal crash where Autopilot was allegedly involved, a Tesla attorney held a mock steering wheel before the jury and emphasized that the driver must always be in control. Autopilot “is basically just fancy cruise control,” he said.

Tesla CEO Elon Musk has painted a different reality, arguing that his technology is making the roads safer: “It’s probably better than a person right now,” Musk said of Autopilot during a 2016 conference call with reporters.

Musk made a similar assertion about a more sophisticated form of Autopilot called Full Self-Driving on an earnings call in July. “Now, I know I’m the boy who cried FSD,” he said. “But man, I think we’ll be better than human by the end of this year.”

For the past seven years Musk has been claiming "better than human" about a system that only works in limited situations and always requires a human to be ready to take over at a moment's notice. Who are you going to believe, the richest man in the world or the fine print in the user documentation?
Philip Koopman, an associate professor at Carnegie Mellon who has studied self-driving-car safety for more than 25 years, said the onus is on the driver to understand the limitations of the technology. But, he said, drivers can get lulled into thinking the technology works better than it does.

Because, after all, the system claims to be "better than human". Except in court, Tesla cannot tell the truth about their Level 2 driver assistance technologies because to do so would decimate their stock price.
“If a system turns on, then at least some users will conclude it must be intended to work there,” Koopman said. “Because they think if it wasn’t intended to work there, it wouldn’t turn on.”
It should be obvious that this technology needs regulation:
The NTSB said it has repeatedly issued recommendations aiming to prevent crashes associated with systems such as Autopilot. “NTSB’s investigations support the need for federal oversight of system safeguards, foreseeable misuse, and driver monitoring associated with partial automated driving systems,” NTSB spokesperson Sarah Sulick said in a statement.

But:
Four years later, despite pleas from safety investigators, regulators in Washington have outlined no clear plan to address those shortcomings, allowing the Autopilot experiment to continue to play out on American roads, with little federal intervention.

As I've been pointing out for years, Musk is running a human-subject experiment with Autopilot and FSD that has not just potentially but actually lethal outcomes. It isn't even a well-designed experiment:
Not only is the marketing misleading, plaintiffs in several cases argue, the company gives drivers a long leash when deciding when and how to use the technology. Though Autopilot is supposed to be enabled in limited situations, it sometimes works on roads it’s not designed for. It also allows drivers to go short periods without touching the wheel and to set cruising speeds well above posted speed limits.

For example, Autopilot was not designed to operate on roads with cross-traffic, Tesla lawyers say in court documents for the Banner case. The system struggles to identify obstacles in its path, especially at high speeds. The stretch of U.S. 441 where Banner crashed was “clearly outside” the environment Autopilot was designed for, the NTSB said in its report. Still, Banner was able to activate it.

The bigger problem is that there is no way for most of the subjects to provide informed consent for this experiment. As one of the involuntary subjects, if asked I would have refused consent. Steven Cliff, a former NHTSA administrator, understands:
“Tesla has decided to take these much greater risks with the technology because they have this sense that it’s like, ‘Well, you can figure it out. You can determine for yourself what’s safe’ — without recognizing that other road users don’t have that same choice,” ...
“If you’re a pedestrian, [if] you’re another vehicle on the road,” he added, “do you know that you’re unwittingly an object of an experiment that’s happening?”