Tuesday, November 19, 2024

Driver Distraction Technology

[Image: "Not this hand-off"]
In the aftermath of the 737 MAX crashes, I wrote First We Change How People Behave:
The fundamental problem of autonomous vehicles sharing roads is that until you get to Level 5, you have a hand-off problem. The closer you get to Level 5, the worse the hand-off problem.
Three years earlier, Paul Vixie was more specific in Disciplining the Unoccupied Mind:
Simply put, if you give a human brain the option to perform other tasks than the one at hand, it will do so. No law, no amount of training, and no insistence by the manufacturer of an automobile will alter this fact. It's human nature, immalleable. So until and unless Tesla can robustly and credibly promise an autopilot that will imagine every threat a human could imagine, and can use the same level of caution as the best human driver would use, then the world will be better off without this feature.
Follow me below the fold for an update on the hand-off problem.

The 737 MAX crashes were a specific, especially difficult, case of the hand-off problem. It wasn't that the automation recognized a situation it couldn't cope with and initiated a hand-off to the humans. It was that the automation was wrong about the situation and the pilots had to decide to override it:
In testing performed in a simulator, Boeing test pilots recreated the conditions aboard Lion Air Flight 610 when it went down in the Java Sea in October, killing 189 people. The tests showed that the crew of the 737 MAX 8 would have only had 40 seconds to respond to the Maneuvering Characteristics Augmentation System’s (MCAS’s) attempts to correct a stall that wasn’t happening before the aircraft went into an unrecoverable dive, according to a report by The New York Times.

While the test pilots were able to correct the issue with the flip of three switches, their training on the systems far exceeded that of the Lion Air crew—and that of the similarly doomed Ethiopian Airlines Flight 302, which crashed earlier this month. The Lion Air crew was heard on cockpit voice recorders checking flight manuals in an attempt to diagnose what was going on moments before they died.
Christine Negroni's What people don’t get about why planes crash stresses the hand-off problem:
In the crash of an Asiana Airlines Boeing 777 landing in San Francisco in 2013, investigators determined that a contributing factor was the pilots’ over-reliance on automated systems which led to an erosion in their flying skills. The investigation of the fatal flight of an Air France Airbus A330 from Rio de Janeiro to Paris in 2009 led to the conclusion that the complexity of the fly-by-wire airplane befuddled the pilots.

The 737 Max probes suggest another variation on the conundrum: Technology intended to protect against pilot error trapped the pilots. Helpless in the cockpit, they were unable to do as Captain Sully did and save the day.
Now, Road & Track's Joe Kucinski reports that Tesla Has the Highest Fatal Accident Rate of All Auto Brands, Study Finds:
The study was conducted on model year 2018–2022 vehicles, and focused on crashes between 2017 and 2022 that resulted in occupant fatalities. Tesla vehicles have a fatal crash rate of 5.6 per billion miles driven, according to the study; Kia is second with a rate of 5.5, and Buick rounds out the top three with a 4.8 rate. The average fatal crash rate for all cars in the United States is 2.8 per billion vehicle miles driven.
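To put those figures in perspective, a quick back-of-the-envelope calculation (using only the rates quoted above) shows how far each of the top three brands sits above the national average:

```python
# Fatal crash rates per billion vehicle miles, as quoted from the study above.
rates = {"Tesla": 5.6, "Kia": 5.5, "Buick": 4.8}
us_average = 2.8  # all cars in the United States, per billion vehicle miles

for brand, rate in rates.items():
    # Ratio of each brand's rate to the national average
    print(f"{brand}: {rate / us_average:.1f}x the US average")
```

Tesla's rate works out to exactly twice the national average, and even third-place Buick is more than 70% above it.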
How is this possible when "Autopilot" is standard on Teslas? The analyst behind the study explains:
So, why are Teslas — and many other ostensibly safe cars on the list — involved in so many fatal crashes? “The models on this list likely reflect a combination of driver behavior and driving conditions, leading to increased crashes and fatalities,” iSeeCars executive analyst Karl Brauer said in the report. “A focused, alert driver, traveling at a legal or prudent speed, without being under the influence of drugs or alcohol, is the most likely to arrive safely regardless of the vehicle they’re driving.”
Precisely. It seems very likely that, lulled by the experience of being cocooned in high-tech safety systems, the drivers were neither focused nor alert, and thus not ready to take over from the automation in the split second when danger threatened.

1 comment:

Tardigrade said...

Given the ample safety features in Teslas, I wonder what the occupant fatality rate comparison would show adjusting for these features.

Do these statistics compensate for suicides?