The reason for yet another post in the series is that Trisha Thadani, Rachel Lerman, Imogen Piper, Faiz Siddiqui and Irfan Uraizee of the Washington Post have published The final 11 seconds of a fatal Tesla Autopilot crash, an extraordinarily detailed forensic analysis of a widely publicized fatal Autopilot crash. This was the crash in which Autopilot failed to detect a semi-trailer crossing its path, and the car ran under the trailer, decapitating the driver.
Below the fold I comment on the details their analysis reveals.
The TL;DR of the Post's analysis and my questions is:
- The driver set Autopilot's speed to 69 mph in a 55 mph zone. Modern cars know the speed limit, so why did Tesla allow this?
- The driver enabled Autopilot on a road with cross traffic despite Autopilot not being allowed on roads with cross traffic. Modern cars have GPS and so know what road they are on, so why did Tesla allow this? (A sketch of both checks follows this list.)
- The driver took his hands off the wheel, and was clearly not paying attention. Tesla's driver monitoring system is notoriously inadequate. It would only have warned the driver to pay attention half a mile down the road. Why would Tesla think that the driver not paying attention for half a mile was OK?
- Autopilot didn't brake - if it had braked only 160' before impact the crash would not have happened. Apparently the fact that its cameras could not reliably detect cross traffic was well-known to Tesla's engineers, which is why Autopilot was not supposed to be enabled on roads with cross traffic. Why would Tesla think a system unable to detect cross traffic was safe for use on public roads?
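Neither check would be hard to implement given data the car already has. Below is a minimal sketch in Python of the two guards the first two bullets call for; the RoadInfo fields and the map data they imply are hypothetical placeholders, not anything Tesla is known to expose:

```python
# Hypothetical sketch of the guard logic argued for above. The data model is
# illustrative only; it is not a real Tesla or map-vendor API.
from dataclasses import dataclass

@dataclass
class RoadInfo:
    speed_limit_mph: float    # posted limit for the current road segment
    has_cross_traffic: bool   # at-grade intersections / crossing traffic
    controlled_access: bool   # entry and exit via ramps only

def clamp_set_speed(requested_mph: float, road: RoadInfo) -> float:
    """Never let the driver set a cruise speed above the posted limit."""
    return min(requested_mph, road.speed_limit_mph)

def autopilot_may_engage(road: RoadInfo) -> bool:
    """Refuse to engage outside the operational design domain."""
    return road.controlled_access and not road.has_cross_traffic

# U.S. 441 at the crash site: 55 mph limit, at-grade cross traffic.
us_441 = RoadInfo(speed_limit_mph=55, has_cross_traffic=True, controlled_access=False)

print(clamp_set_speed(69, us_441))    # -> 55, not 69
print(autopilot_may_engage(us_441))   # -> False: engagement should be refused
```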
To reconstruct the crash, The Post relied on hundreds of court documents, dash cam photos and a video of the crash taken from a nearby farm, as well as satellite imagery, NTSB crash assessment documents and diagrams, and Tesla's internal data log, which the NTSB included in its investigation report.

The driver, Jeremy Banner, started the accident by violating the law:
At 6:16 a.m., Banner sets cruise control to a maximum of 69 mph, though the speed limit on U.S. 441 is 55. He turns on Autopilot 2.4 seconds later.

It is typical of Tesla's disdain for the law that, although their cars have GPS and can therefore know the speed limit, they didn't bother to program Autopilot to obey the law.
Banner didn't just violate the law, he also violated Tesla's user documentation:
According to Tesla's user documentation, Autopilot wasn't designed to work on a highway with cross-traffic such as U.S. 441. But drivers sometimes can activate it in areas and under conditions for which it is not designed.

Again, Tesla's disdain for the safety of their customers, not to mention other road users, meant that even though the car knew which road it was on, and thus whether Autopilot should be allowed on it, the car still let Banner enable it.
Banner immediately violated the user documentation again:
Two seconds later, the Tesla's data log registers no "driver-applied wheel torque," meaning Banner's hands cannot be detected on the wheel.

Safety should have required an immediate warning and for Autopilot to start braking the car. But no:
If Autopilot does not detect a driver's hands, it flashes a warning. In this case, given Banner's speed, the warning would have come after about 25 seconds, according to the NTSB investigation.

25 seconds at 69 mph is 2,530 feet or 0.48 miles. Although Tesla admits that Autopilot is a Level 2 driver assistance system which requires the driver's full attention at all times, they think it is OK for the car to drive itself for half a mile at 69 mph before the driver needs to start paying attention. Of course, there would be an additional delay after the warning as the driver recovered situational awareness.
Banner does not have that long.
Combined with Tesla's pathetic "wheel torque" system for detecting whether the driver is paying attention, a 25-second delay is obviously why their cars keep running into first responders, highway dividers and other obstacles up to half a mile away from where the driver zoned out.
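For reference, the half-mile figure is just unit arithmetic on the NTSB's 25-second number:

```python
# Distance covered during the 25-second hands-off grace period at the set speed.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_mph = 69
delay_s = 25                     # NTSB's figure for the hands-off warning delay

speed_fps = speed_mph * FEET_PER_MILE / SECONDS_PER_HOUR   # ~101.2 ft/s
distance_ft = speed_fps * delay_s                          # ~2,530 ft
print(f"{distance_ft:.0f} ft, {distance_ft / FEET_PER_MILE:.2f} miles")
# -> 2530 ft, 0.48 miles
```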
But from Tesla's point of view, actually enforcing the rule that the driver have their hands on the wheel and be paying attention would involve "nagging" the driver. And this would make it clear even to the fan-bois that the technology was nothing like the fantasy of Tesla's marketing. And that the idea that Teslas being used as robo-taxis would more than double Tesla's market cap was a sick joke.
Two seconds later — just before impact — the Tesla's forward-facing camera captures this image of the truck.

Banner is not paying attention, and is now doomed:
The car does not warn Banner of the obstacle. “According to Tesla, the Autopilot vision system did not consistently detect and track the truck as an object or threat as it crossed the path of the car,” the NTSB crash report says.
The Tesla continues barreling toward the tractor-trailer at nearly 69 mph. Neither Banner nor Autopilot activates the brakes.
The Tesla continues on for another 40 seconds, traveling about 1,680 feet — nearly a third of a mile — before finally coasting to a stop on a grassy median.

Late braking could have saved Banner:
Braking even 1.6 seconds before the crash could have avoided the collision, The Post's reconstruction found by reviewing braking distance measurements of a 2019 Tesla Model 3 with similar specifications, conducted by vehicle testers at Car and Driver. At this point the truck was well within view and spanning both lanes of southbound traffic.
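The Post's 1.6-second figure is consistent with the 160-foot braking distance mentioned in the TL;DR above. A rough sanity check, assuming constant deceleration and ignoring brake-actuation latency:

```python
# Does braking ~1.6 s before impact match a ~160 ft stopping distance from 69 mph?
# Constant-deceleration model only; an order-of-magnitude check, not a reconstruction.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

speed_fps = 69 * FEET_PER_MILE / SECONDS_PER_HOUR    # ~101.2 ft/s
stopping_distance_ft = 160                           # figure cited above

# Implied constant deceleration from v^2 = 2 * a * d
decel_fps2 = speed_fps ** 2 / (2 * stopping_distance_ft)   # ~32 ft/s^2
g_force = decel_fps2 / 32.2                                 # ~1 g, plausible for hard braking

# Ground covered in the 1.6 s the Post says would have sufficed
run_up_ft = speed_fps * 1.6                                 # ~162 ft

print(f"implied deceleration ~{decel_fps2:.0f} ft/s^2 (~{g_force:.1f} g)")
print(f"distance to the truck 1.6 s out: ~{run_up_ft:.0f} ft vs ~160 ft needed to stop")
```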
The NTSB concluded:

The NTSB investigation determined that Banner's inattention and the truck driver's failure to fully yield to oncoming traffic were probable causes of the crash.

Where did Banner's "overreliance on automation" come from? The Post team have a suggestion:
However, the NTSB also cited Banner’s “overreliance on automation,” saying Tesla’s design “permitted disengagement by the driver” and contributed to the crash.
Banner researched Tesla for years before buying a Model 3 in 2018, his wife, Kim, told federal investigators. Around the time of his purchase, Tesla's website featured a video showing a Tesla navigating the curvy roads and intersections of California while a driver sits in the front seat, hands hovering beneath the wheel.

This video is a notorious fake:
The video, recorded in 2016, is still on the site today.
“The person in the driver’s seat is only there for legal reasons,” the video says. “He is not doing anything. The car is driving itself.”
a Tesla engineer testified that a team specifically mapped the route the car would take in the video. At one point during testing for the video, a test car crashed into a fence, according to Reuters. The engineer said in a deposition that the video was meant to show what the technology could eventually be capable of — not what cars on the road could do at the time.

There is a massive disconnect between what Musk and Tesla's marketing say about their driver assistance technologies and what they tell regulators and juries:
While the video concerned Full Self-Driving, which operates on surface streets, the plaintiffs in the Banner case argue Tesla’s “marketing does not always distinguish between these systems.”
In a Riverside, Calif., courtroom last month in a lawsuit involving another fatal crash where Autopilot was allegedly involved, a Tesla attorney held a mock steering wheel before the jury and emphasized that the driver must always be in control.

For the past seven years Musk has been claiming "better than human" about a system that only works in limited situations and always requires a human to be ready to take over at a moment's notice. Who are you going to believe, the richest man in the world or the fine print in the user documentation?
Autopilot “is basically just fancy cruise control,” he said.
Tesla CEO Elon Musk has painted a different reality, arguing that his technology is making the roads safer: “It’s probably better than a person right now,” Musk said of Autopilot during a 2016 conference call with reporters.
Musk made a similar assertion about a more sophisticated form of Autopilot called Full Self-Driving on an earnings call in July. “Now, I know I’m the boy who cried FSD,” he said. “But man, I think we’ll be better than human by the end of this year.”
Philip Koopman, an associate professor at Carnegie Mellon who has studied self-driving-car safety for more than 25 years, said the onus is on the driver to understand the limitations of the technology. But, he said, drivers can get lulled into thinking the technology works better than it does.

Because, after all, the system claims to be "better than human". Except in court, Tesla cannot tell the truth about their Level 2 driver assistance technologies, because to do so would decimate their stock price.
“If a system turns on, then at least some users will conclude it must be intended to work there,” Koopman said. “Because they think if it wasn’t intended to work there, it wouldn’t turn on.”
It should be obvious that this technology needs regulation:
The NTSB said it has repeatedly issued recommendations aiming to prevent crashes associated with systems such as Autopilot. "NTSB's investigations support the need for federal oversight of system safeguards, foreseeable misuse, and driver monitoring associated with partial automated driving systems," NTSB spokesperson Sarah Sulick said in a statement.

But:
Four years later, despite pleas from safety investigators, regulators in Washington have outlined no clear plan to address those shortcomings, allowing the Autopilot experiment to continue to play out on American roads, with little federal intervention.

As I've been pointing out for years, Musk is running a human-subject experiment with Autopilot and FSD whose outcomes are not just potentially but actually lethal. It isn't even a well-designed experiment:
Not only is the marketing misleading, plaintiffs in several cases argue, the company gives drivers a long leash when deciding when and how to use the technology. Though Autopilot is supposed to be enabled in limited situations, it sometimes works on roads it's not designed for. It also allows drivers to go short periods without touching the wheel and to set cruising speeds well above posted speed limits.

The bigger problem is that there is no way for most of the subjects to provide informed consent for this experiment. As one of the involuntary subjects, if asked I would have refused consent. Steven Cliff, a former NHTSA administrator, understands:
For example, Autopilot was not designed to operate on roads with cross-traffic, Tesla lawyers say in court documents for the Banner case. The system struggles to identify obstacles in its path, especially at high speeds. The stretch of U.S. 441 where Banner crashed was “clearly outside” the environment Autopilot was designed for, the NTSB said in its report. Still, Banner was able to activate it.
“Tesla has decided to take these much greater risks with the technology because they have this sense that it’s like, ‘Well, you can figure it out. You can determine for yourself what’s safe’ — without recognizing that other road users don’t have that same choice,” ...
“If you’re a pedestrian, [if] you’re another vehicle on the road,” he added, “do you know that you’re unwittingly an object of an experiment that’s happening?”
Iain Thomson reports that Red light for robotaxis as California suspends Cruise's license to self-drive:
"California's Department of Motor Vehicles has rescinded GM-owned Cruise's right to roam the streets, citing public safety and accusing the biz of withholding information.
"Public safety remains the California DMV's top priority, and the department's autonomous vehicle regulations provide a framework to facilitate the safe testing and deployment of this technology on California public roads," the DMV said in a statement.
"When there is an unreasonable risk to public safety, the DMV can immediately suspend or revoke permits. There is no set time for a suspension."
The agency said the vehicles are "not safe for the public's operation," although just last month the agency gave the green light for their use on the streets of San Francisco. It also accused Cruise of misrepresenting the capabilities and safety data of their cars to regulators."
Ashley Belanger's California suspends Cruise robotaxis after car dragged pedestrian 20 feet explains why the DMV shut down Cruise's robotaxis:
"The suspension followed two notable accidents involving Cruise's robotaxis. In August, one person was injured after a Cruise vehicle crashed into a fire truck, CNBC reported. And earlier this month, a pedestrian using a crosswalk was found in critical condition after a driver of another vehicle struck the pedestrian and threw her into the path of an oncoming Cruise robotaxi.
This hit-and-run incident is still being investigated. According to Cruise, its autonomous vehicle (AV) detected the collision and stopped on top of the pedestrian, then veered off the road, dragging the pedestrian about 20 feet. When the AV finally stopped, it appeared to pin the pedestrian's leg beneath a tire while videos showed the pedestrian was screaming for help."
Once upon a time I was a lot more optimistic about self-driving cars, mostly because humans are constantly cutting safety corners, and because an automated car could enforce safety algorithmically -- example, don't drive faster than you can stop when your vision is obstructed, or (personal bicycle rule) 3 or more children/dogs means walking speed.
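The first rule is easy to make quantitative. A rough sketch, with the deceleration and reaction-time numbers as illustrative assumptions:

```python
# Cap speed so that reaction distance plus braking distance never exceeds
# the distance you can actually see. Deceleration and reaction time are
# illustrative assumptions, not measured values.
import math

FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

def max_safe_speed_mph(sight_distance_ft: float,
                       decel_fts2: float = 15.0,   # moderate braking, ~0.47 g (assumption)
                       reaction_s: float = 1.5) -> float:
    """Largest speed where reaction distance + braking distance fits in the sight distance."""
    a, d, r = decel_fts2, sight_distance_ft, reaction_s
    # Solve v*r + v^2/(2a) = d for the positive root.
    v_fps = -a * r + math.sqrt((a * r) ** 2 + 2 * a * d)
    return v_fps * SECONDS_PER_HOUR / FEET_PER_MILE

print(round(max_safe_speed_mph(100)))   # ~25 mph if you can only see 100 ft ahead
print(round(max_safe_speed_mph(500)))   # ~70 mph with 500 ft of clear view
```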
One of the things that made me reconsider was a Tesla video: https://vimeo.com/192179727 At 2:19, in a video that runs I think at double speed, the Tesla makes a safety mistake that I would not make on a bicycle, and I know that I would not because I spotted the mistake on the first viewing -- there's a guy walking two dogs right by the edge of the road, but the Tesla neither swerves to leave room nor slows down. If you're optimizing safety, you do not do that because dogs are not predictable and there's not enough clearance (at that speed) to leave them room to be unpredictable. It's an obvious (to me) mistake.
I don't know that a typical driver would have noticed the problem (I have a low opinion of typical drivers), but Tesla ought to employ safety experts, and I expect an actual paid-professional-expert would be even better at this than experienced-amateur-me. That this video went out with this flaw in it was not confidence-inspiring, since they were actually proud of it. Do they not care about safety? Do they have no experts on staff? Either way, a bad indicator.
Andrew J. Hawkins reports that Hertz is scaling back its EV ambitions because its Teslas keep getting damaged:
"repair costs are about double what the company spends on gas car fixes, Hertz CEO Stephen Scherr told Bloomberg.
...
Of the 100,000 Teslas acquired by Hertz, half were to be allocated to Uber drivers as part of a deal with the ridehail company. And drivers said they loved the Teslas! But Uber drivers also tend to drive their vehicles into the ground. This higher rate of utilization can lead to a lot of damage — certainly more than Hertz was anticipating.
...
Price cuts have taken another toll on Hertz. “The MSRP [manufacturer suggested retail price] declines in EVs over the course of 2023, driven primarily by Tesla, have driven the fair market value of our EVs lower as compared to last year, such that a salvage creates a larger loss and, therefore, greater burden,” Scherr said."
David Welch casts more doubt on the Tesla robotaxi valuation hype with Cruise’s Suspension Marks a Setback for GM CEO Barra’s Vision:
"A couple of hours after Chief Executive Officer Mary Barra finished telling Wall Street analysts that the Cruise self-driving unit was an undervalued piece of General Motors Co., California suspended its robotaxi license citing a risk to public safety.
The situation went from bad to worse on Thursday after Cruise said it would cease operations in all four cities where it charges for rides. That sapped what little revenue there was from a business Barra hoped would bring in $1 billion in fares by 2025 and help double GM’s sales to $240 billion in seven years.
Now, GM has a division spending about $700 million a quarter that is rife with regulatory, legal and reputational problems."
Jonathan M. Gitlin reports that Tesla Autopilot not responsible for 2019 fatal crash, jury says:
"Tesla's controversial driver assistance feature Autopilot has received another pass. On Tuesday a jury in California found that Autopilot was not to blame for a 2019 crash in Riverside County that killed the driver and left his wife and son severely injured. That marks the second time this year a jury has found that Autopilot was not responsible for a serious crash.
The case was filed by the two survivors of the Riverside crash and alleged that an Autopilot malfunction caused Micah Lee's Tesla Model 3 to veer off a highway at 65 mph (105 km/h) before it struck a tree and burst into flames. Lee died in the crash, and his wife and then-8-year-old son were seriously injured; as a result the plaintiffs asked for $400 million plus punitive damages."
Lindsay Clark reports on Rishi Sunak's continuing infatuation with technology in
UK signals legal changes to self-driving vehicle liabilities:
"In the King's Speech this week, in which the governing party sets out its legislative program, King Charles III said ministers would "introduce new legal frameworks to support the safe commercial development of emerging industries, such as self-driving vehicles" in the Automated Vehicles Bill.
In guidance notes [PDF] accompanying the speech, the government said it needed to update the law to "ensure the potential benefits of self-driving technologies can become a reality." The government expect this transport revolution to create a UK market of up to £42 billion ($51.3 billion) and 38,000 skilled jobs by 2035.
The proposed legislation – details of which will become available when it is introduced to Parliament – would put "safety and the protection of the user at the heart of our new regime and makes sure that only the driver – be it the vehicle or person – is accountable, clarifying and updating the law," the government promised.
...
Jesse Norman, Department for Transport minister, said there was "potentially likely to be an offence [for the user] not to be available for a transition request from a vehicle," for which there would be "severe sanctions."
...
UCL professor Jack Stilgoe said "a regime of data sharing" would be "absolutely key" to ensure self-driving vehicles operate safely, although industry representatives remained sceptical about manufacturers' appetite for sharing data.
The UK government has promised to put laws in place to "unlock a transport revolution" in self-driving vehicles."
Today's Elon Musk assholery is a two-fer.
First, Reuters' Marisa Taylor's At SpaceX, worker injuries soar in Elon Musk's rush to Mars:
"Reuters documented at least 600 previously unreported workplace injuries at Musk’s rocket company: crushed limbs, amputations, electrocutions, head and eye wounds and one death. SpaceX employees say they’re paying the price for the billionaire’s push to colonize space at breakneck speed.
...
The more than 600 SpaceX injuries Reuters documented represent only a portion of the total case count, a figure that is not publicly available. OSHA has required companies to report their total number of injuries annually since 2016, but SpaceX facilities failed to submit reports for most of those years. About two-thirds of the injuries Reuters uncovered came in years when SpaceX did not report that annual data, which OSHA collects to help prioritize on-site inspections of potentially dangerous workplaces.
...
For years, Musk and his deputies found it “hilarious” to wave the flamethrower around, firing it near other people and giggling “like they were in middle school,” one engineer said. Musk tweeted in 2018 that the flamethrower was “guaranteed to liven up any party!” At SpaceX, Musk played with the device in close-quarters office settings, said the engineer, who at one point feared Musk would set someone’s hair on fire.
Musk also became known in California and Texas for ordering machinery that was painted in industrial safety yellow to be repainted black or blue because of his aversion to bright colors, according to three former SpaceX supervisors. Managers also sometimes told workers to avoid wearing safety-yellow vests around Musk, or to replace yellow safety tape with red, the supervisors said."
Second, Marco Margaritoff's Grimes Says Elon Musk Evaded Being Served With Child Custody Papers At Least 12 Times:
"The tech billionaire reportedly evaded numerous process servers in at least a dozen locations after Grimes sued Musk in late September for physical custody of their three children, according to court documents obtained Friday by Insider.
Proof-of-service papers filed last week showed Grimes hired four people who, between Oct. 13 and Oct. 20, tried serving Musk at the X headquarters in San Francisco, the SpaceX launch site in Boca Chica, Texas, and his Tesla gigafactory in Austin."
Ashley Belanger reports on preliminary findings by Reid Scott, the judge in the Jeremy Banner trial, in Elon Musk and Tesla ignored Autopilot's fatal flaws, judge says evidence shows:
"A Florida judge, Reid Scott, has ruled that there's "reasonable evidence" to conclude that Tesla and its CEO Elon Musk knew of defects in Autopilot systems and failed to fix them. Testimony from Tesla engineers and internal documents showed that Musk was "intimately involved" in Tesla's Autopilot program and "acutely aware" of a sometimes-fatal defect—where Autopilot repeatedly fails to detect cross traffic, Scott wrote.
"Knowing that the Autopilot system had previously failed, had limitations" and, according to one Tesla Autopilot systems engineer, "had not been modified, Tesla still permitted the 'Autopilot' system to be engaged on roads that encountered areas of cross traffic," Scott wrote.
...
Seemingly most worrying to the judge, Tesla engineers told Scott that following Banner's death in 2019 and the "eerily similar" death of another Florida driver, Joshua Brown, in 2016, the automaker did nothing to intervene or update Autopilot's cross-traffic detection system. Tesla continues to market the Autopilot feature as safe for drivers today, Scott wrote."
The judge was granting a motion by Banner's wife to seek punitive damages. In particular, the judge focused on the Elon Musk demo video:
"Scott noted that Tesla's marketing of Autopilot is "important" in this case, pointing to a 2016 video that's still on Tesla's website, where Tesla claimed "the car is driving itself." The video, Scott wrote, shows the car navigating scenarios "not dissimilar" to the cross-traffic encounter with a truck that killed Banner's husband.
"Absent from this video is any indication that the video is aspirational or that this technology doesn’t currently exist in the market," Scott wrote."
Ex-Tesla employee casts doubt on car safety by Zoe Kleinman & Liv McMahon reports that:
"Lukasz Krupski leaked data, including customer complaints about Tesla's braking and self-driving software, to German newspaper Handelsblatt in May.
...
"I don't think the hardware is ready and the software is ready," he said.
"It affects all of us because we are essentially experiments in public roads. So even if you don't have a Tesla, your children still walk in the footpath."
Mr Krupski said he had found evidence in company data which suggested that requirements relating to the safe operation of vehicles that had a certain level of autonomous or assistive-driving technology had not been followed.
He added that even Tesla employees had spoken to him about vehicles randomly braking in response to non-existent obstacles - known as "phantom braking". This also came up in the data he obtained around customer complaints."
I was involved in a "phantom braking" incident.
Brandon Vigliarolo reports that Tesla says California's Autopilot action violates its free speech rights:
"It may have taken more than a year, but Tesla has finally responded to the California Department of Motor Vehicles allegations that it misrepresented Autopilot's capabilities, arguing that it's free to do so under the US Constitution.
In a document [PDF] filed with California's Office of Administrative Hearings last week lawyers representing Elon Musk's electric car company didn't directly challenge the DMV's specific allegations that Tesla may have overblown Autopilot's autonomy, marketing it less as an advanced driver assist system (ADAS) and more of a full self-driving platform.
Instead, they said that the DMV's case, filed in July 2022, ought to be tossed because it's "facially invalid under the First Amendment to the United States Constitution and Article I, Section 2, of the California Constitution."
Tesla blamed drivers for failures of parts it long knew were defective by Hyunjoo Jin, Kevin Krolicki, Marie Mannes and Steve Stecklow continues Reuters' reporting on Musk's unethical business practices:
"The chronic failures, many in relatively new vehicles, date back at least seven years and stretch across Tesla's model lineup and across the globe, from China to the United States to Europe, according to the records and interviews with more than 20 customers and nine former Tesla managers or service technicians.
Individual suspension or steering issues with Teslas have been discussed online and in news accounts for years. But the documents, which have not been previously reported, offer the most comprehensive view to date into the scope of the problems and how Tesla handled what its engineers have internally called part “flaws” and “failures.” The records and interviews reveal for the first time that the automaker has long known far more about the frequency and extent of the defects than it has disclosed to consumers and safety regulators.
The documents, dated between 2016 and 2022, include repair reports from Tesla service centers globally; analyses and data reviews by engineers on parts with high failure rates; and memos sent to technicians globally, instructing them to tell consumers that broken parts on their cars were not faulty.
...
Tesla’s handling of suspension and steering complaints reflects a pattern across Musk’s corporate empire of dismissing concerns about safety or other harms raised by customers, workers and others as he rushes to roll out new products or expand sales, Reuters has found."
Some indication of why Tesla employees are reluctant to admit faults comes from Noam Scheiber's SpaceX Illegally Fired Workers Critical of Musk, Federal Agency Says:
"Federal labor officials accused the rocket company SpaceX on Wednesday of illegally firing eight employees for circulating a letter critical of the company's founder and chief executive, Elon Musk.
According to a complaint issued by a regional office of the National Labor Relations Board, the company fired the employees in 2022 for calling on SpaceX to distance itself from social media comments by Mr. Musk, including one in which he mocked sexual harassment accusations against him."
The headline and subhead of Emily Glazer and Kirsten Grind's WSJ article says it all:
Elon Musk Has Used Illegal Drugs, Worrying Leaders at Tesla and SpaceX
Some executives and board members fear the billionaire’s use of drugs—including LSD, cocaine, ecstasy, mushrooms and ketamine—could harm his companies
Paul Walsh reports that Filings: Tesla driver says if he did kill Mille Lacs doctor, he might have been on autopilot, distracted:
"An Edina man who initially denied killing a small-town doctor in a hit-and-run near Lake Mille Lacs last fall told investigators that he didn't remember hitting the woman with his Tesla — but if he did, he would have been driving on autopilot and checking emails."
So that's OK then, it wasn't his fault, he just believed Elon Musk not the Tesla documentation. So it wasn't Tesla's fault either.
You and I are at risk from driving and walking around in company with idiots like this. At least the Model Y alongside me on Alma St. had the license plate HAL FSD as a warning - the driver did not appear to have hands on the wheel.
The Washington Post team of Trisha Thadani, Faiz Siddiqui, Rachel Lerman, Whitney Shefte, Julia Wall and Talia Trackim are back with Tesla worker killed in fiery crash may be first ‘Full Self-Driving’ fatality, another detailed account of a member of Musk's cult gaining a Darwin Award:
"Hans von Ohain and Erik Rossiter were on their way to play golf one afternoon in 2022 when von Ohain's Tesla suddenly swerved off Upper Bear Creek Road. The car's driver-assistance software, Full Self-Driving, was struggling to navigate the mountain curves, forcing von Ohain repeatedly to yank it back on course.
“The first time it happened, I was like, ‘Is that normal?’” recalled Rossiter, who described the five-mile drive on the outskirts of Denver as “uncomfortable.” “And he was like, ‘Yeah, that happens every now and then.’”
Hours later, on the way home, the Tesla Model 3 barreled into a tree and exploded in flames, killing von Ohain, a Tesla employee and devoted fan of CEO Elon Musk. Rossiter, who survived the crash, told emergency responders that von Ohain was using an “auto-drive feature on the Tesla” that “just ran straight off the road,” according to a 911 dispatch recording obtained by The Washington Post. In a recent interview, Rossiter said he believes that von Ohain was using Full Self-Driving, which — if true — would make his death the first known fatality involving Tesla’s most advanced driver-assistance technology."
It seems the driver thought that it was OK to drive home with a blood alcohol level of 0.26 because he believed Musk's hype that Fake Self Driving would handle it despite having to repeatedly override it on the way out. Does this make you feel safe around Teslas?
According to the Washington Post article:
"A Tesla driver who caused an eight-car pileup with multiple injuries on the San Francisco-Oakland Bay Bridge in 2022 told police he was using Full Self-Driving."