In the aftermath of the 737 MAX crashes, I wrote
First We Change How People Behave:
The fundamental problem of autonomous vehicles sharing roads is that until you get to Level 5, you have a hand-off problem. The closer you get to Level 5, the worse the hand-off problem.
Three years earlier, Paul Vixie was more specific in
Disciplining the Unoccupied Mind:
Simply put, if you give a human brain the option to perform other tasks than the one at hand, it will do so. No law, no amount of training, and no insistence by the manufacturer of an automobile will alter this fact. It's human nature, immalleable. So until and unless Tesla can robustly and credibly promise an autopilot that will imagine every threat a human could imagine, and can use the same level of caution as the best human driver would use, then the world will be better off without this feature.
Follow me below the fold for an update on the hand-off problem.
The 737 MAX crashes were a specific, especially difficult, case of the hand-off problem. It wasn't that the automation recognized a situation it couldn't cope with and initiated a hand-off to the humans. It was that the automation was wrong about the situation and the pilots had to decide to
override it:
In testing performed in a simulator, Boeing test pilots recreated the conditions aboard Lion Air Flight 610 when it went down in the Java Sea in October, killing 189 people. The tests showed that the crew of the 737 MAX 8 would have only had 40 seconds to respond to the Maneuvering Characteristics Augmentation System’s (MCAS’s) attempts to correct a stall that wasn’t happening before the aircraft went into an unrecoverable dive, according to a report by The New York Times.
While the test pilots were able to correct the issue with the flip of three switches, their training on the systems far exceeded that of the Lion Air crew—and that of the similarly doomed Ethiopian Airlines Flight 302, which crashed earlier this month. The Lion Air crew was heard on cockpit voice recorders checking flight manuals in an attempt to diagnose what was going on moments before they died.
Christine Negroni's
What people don’t get about why planes crash stresses the hand-off problem:
"In the crash of an Asiana Airlines Boeing 777 landing in San Francisco in 2013, investigators determined that a contributing factor was the pilots’ over-reliance on automated systems which led to an erosion in their flying skills. The investigation of the fatal flight of an Air France Airbus A330 from Rio de Janeiro to Paris in 2009 led to the conclusion that the complexity of the fly-by-wire airplane befuddled the pilots.
The 737 Max probes suggest another variation on the conundrum: Technology intended to protect against pilot error trapped the pilots. Helpless in the cockpit, they were unable to do as Captain Sully did and save the day."
Now,
Road & Track's Joe Kucinski reports that
Tesla Has the Highest Fatal Accident Rate of All Auto Brands, Study Finds:
The study was conducted on model year 2018–2022 vehicles, and focused on crashes between 2017 and 2022 that resulted in occupant fatalities. Tesla vehicles have a fatal crash rate of 5.6 per billion miles driven, according to the study; Kia is second with a rate of 5.5, and Buick rounds out the top three with a 4.8 rate. The average fatal crash rate for all cars in the United States is 2.8 per billion vehicle miles driven.
How is this possible when "Autopilot" is standard on Teslas? The analyst behind the study
explains:
So, why are Teslas — and many other ostensibly safe cars on the list — involved in so many fatal crashes? “The models on this list likely reflect a combination of driver behavior and driving conditions, leading to increased crashes and fatalities,” iSeeCars executive analyst Karl Brauer said in the report. “A focused, alert driver, traveling at a legal or prudent speed, without being under the influence of drugs or alcohol, is the most likely to arrive safely regardless of the vehicle they’re driving.”
Precisely. It seems very likely that the drivers, lulled by the experience of being cocooned in high-tech safety systems, were neither focused nor alert, ready to take over from the automation in a split second when danger threatened.
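To put the study's headline numbers in perspective, here is a minimal sketch of the comparison, using only the per-billion-mile rates quoted above (the variable names are mine). Note that Tesla's rate works out to exactly twice the US average.

```python
# A minimal sketch of the arithmetic behind the iSeeCars figures quoted
# above. A fatal crash rate is fatal crashes divided by billions of
# vehicle miles traveled; the brand rates and the 2.8 US average are
# taken directly from the study as quoted.

US_AVERAGE = 2.8  # fatal crashes per billion vehicle miles, all US cars

study_rates = {  # fatal crashes per billion vehicle miles, by brand
    "Tesla": 5.6,
    "Kia": 5.5,
    "Buick": 4.8,
}

for brand, rate in study_rates.items():
    # Express each brand's rate as a multiple of the national average.
    print(f"{brand}: {rate} per billion miles, "
          f"{rate / US_AVERAGE:.1f}x the US average")
```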
3 comments:
Given the ample safety features in Teslas, I wonder what the occupant fatality rate comparison would show adjusting for these features.
Do these statistics account for suicides?
From the "what have they got to hide" department comes Trump team to scrap car-crash disclosure rule opposed by Tesla – report:
"The Trump transition team has recommended the incoming administration to abolish car-crash reporting requirement that has been opposed by Tesla, reported Reuters, citing a document it has seen.
This recommendation, if enacted, could hinder the government's capability to monitor and regulate the safety of automated-driving systems."
Musk says that Autopilot and Fake Self Driving are much safer than human drivers, so why wouldn't he want the government to collect and release this data? Could it be that he isn't being truthful?
Severin Carrell reports that Driverless bus service in Scotland to be withdrawn due to lack of interest:
"The UK’s first driverless bus service, originally heralded as a breakthrough of global significance, is being withdrawn from service because too few passengers used it.
The autonomous buses, operated by Stagecoach, have been running between Fife and Edinburgh along a 14-mile route over the Forth road bridge since May 2023 to relieve the heavy congestion which can bring traffic to a standstill."
...
Built at an estimated cost of more than £6m, partly funded by the UK government, the fleet of five single-decker buses had the capacity to carry 10,000 passengers a week"
This is worth noting:
"but needed two crew on board for safety reasons."
In other words, a "driverless" bus needed twice the crew of a conventional bus.