Tuesday, April 2, 2019

First We Change How People Behave

Then the system will work the way we want. My skepticism about Level 5 self-driving cars keeps getting reinforced. Below the fold, two recent examples.

The fundamental problem of autonomous vehicles sharing roads is that until you get to Level 5, you have a hand-off problem. The closer you get to Level 5, the worse the hand-off problem.

Sean Gallagher's "Lion Air 737 MAX crew had seconds to react, Boeing simulation finds" shows the hand-off problem for aircraft:
In testing performed in a simulator, Boeing test pilots recreated the conditions aboard Lion Air Flight 610 when it went down in the Java Sea in October, killing 189 people. The tests showed that the crew of the 737 MAX 8 would have only had 40 seconds to respond to the Maneuvering Characteristics Augmentation System’s (MCAS’s) attempts to correct a stall that wasn’t happening before the aircraft went into an unrecoverable dive, according to a report by The New York Times.

While the test pilots were able to correct the issue with the flip of three switches, their training on the systems far exceeded that of the Lion Air crew—and that of the similarly doomed Ethiopian Airlines Flight 302, which crashed earlier this month. The Lion Air crew was heard on cockpit voice recorders checking flight manuals in an attempt to diagnose what was going on moments before they died.
Great, must-read journalism from Dominic Gates at the Seattle Times, Boeing's home-town newspaper, in "Flawed analysis, failed oversight: How Boeing and FAA certified the suspect 737 MAX flight control system", shows that the fundamental problem with the 737 MAX was regulatory capture of the FAA by Boeing. The FAA's priority wasn't to make the 737 MAX safe, it was to get it to market as quickly as possible, because Airbus had a 9-month lead in this segment. And because Airbus' fly-by-wire planes minimize the need for expensive pilot re-training, Boeing's priority was to remove the need for it.
The company had promised Southwest Airlines Co., the plane’s biggest customer, to keep pilot training to a minimum so the new jet could seamlessly slot into the carrier’s fleet of older 737s, according to regulators and industry officials.

[Former Boeing engineer] Mr. [Rick] Ludtke [who worked on 737 MAX cockpit features] recalled midlevel managers telling subordinates that Boeing had committed to pay the airline $1 million per plane if its design ended up requiring pilots to spend additional simulator time. “We had never, ever seen commitments like that before,” he said.
The software fix Boeing just announced is merely a patch on a fundamentally flawed design, as George Leopold reports in "Software Won’t Fix Boeing’s ‘Faulty’ Airframe". Boeing gamed the regulations, and the FAA let them do it. Neither placed safety first. These revelations should completely destroy the credibility of FAA certifications.

Although Boeing's highly-trained test pilots didn't have to RTFM, they still had only 40 seconds to diagnose and remedy the problem caused by the faulty angle-of-attack sensor and the buggy MCAS software. The inadequately trained Lion Air and Ethiopian Airlines pilots never stood a chance of a successful hand-off. Self-driving car advocates assume that hand-offs are initiated by the software recognizing a situation it can't handle. But in this case the MCAS software was convinced, on the basis of a faulty sensor, that it was handling the situation, and refused to hand off to the pilots 24 times in succession.
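To make that failure mode concrete, here is a minimal sketch in Python. It is purely illustrative and emphatically not Boeing's MCAS code; the names, thresholds and trim increments are my assumptions. It shows how a control loop that trusts a single faulty sensor keeps re-engaging instead of handing off:

from dataclasses import dataclass

STALL_AOA_DEG = 15.0   # assumed angle of attack above which the system "sees" a stall
TRIM_STEP_DEG = 2.5    # assumed nose-down trim applied each time it re-engages

@dataclass
class State:
    trim_deg: float = 0.0    # cumulative stabilizer trim (negative = nose down)
    activations: int = 0     # how many times the automation re-engaged

def faulty_aoa_sensor() -> float:
    """A stuck vane that always reports an impossibly high angle of attack."""
    return 22.0   # degrees; wrong, but the controller has no second opinion

def control_cycle(state: State, pilot_trim_deg: float) -> State:
    """One cycle: the pilot counter-trims, then the automation re-reads its
    single (faulty) sensor and re-engages if it still 'sees' a stall."""
    state.trim_deg += pilot_trim_deg            # the pilot tries to recover
    if faulty_aoa_sensor() > STALL_AOA_DEG:     # no cross-check, no give-up logic
        state.trim_deg -= TRIM_STEP_DEG         # the automation pushes the nose down again
        state.activations += 1
    return state

if __name__ == "__main__":
    state = State()
    for _ in range(24):                         # the crew fights it again and again
        state = control_cycle(state, pilot_trim_deg=2.0)
    print(f"activations={state.activations}, net trim={state.trim_deg:+.1f} deg")

The numbers don't matter; the point is that the hand-off never happens, because the software, fed by one bad sensor, believes it is handling the situation.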

Self-driving car stopper
Drivers of self-driving cars will lack even the level of training of the dead pilots. The cars' software is equally dependent upon sensors, which can be fooled by stickers on the road*, and cannot handle rain, sleet or snow. Or, as it turns out, pedestrians. As David Zipper tweeted:
Atrios' apt comment was:
It is this type of thing which makes me obsess about this issue. And I have a couple insider sources (ooooh I am a real journalist) who confirm these concerns. The self-driving car people see pedestrians as a problem. I don't really understand how you can think urban taxis are your business model and also think walking is the enemy. Cities are made of pedestrians. Well, cities other than Phoenix, anyway. I pay a dumb mortgage so I can walk to a concert, like I did last night.
But no-one who matters cares about pedestrians because no-one who matters is ever on the sidewalk, let alone crossing the street. As the CDC reports:
In 2016, 5,987 pedestrians were killed in traffic crashes in the United States. This averages to one crash-related pedestrian death every 1.5 hours.

Additionally, almost 129,000 pedestrians were treated in emergency departments for non-fatal crash-related injuries in 2015. Pedestrians are 1.5 times more likely than passenger vehicle occupants to be killed in a car crash on each trip.
The casualties who don't "know what they can't do" won't add much to the deaths and injuries, so we can just go ahead and deploy the technology ASAP.



* Tesla says the "stickers on the road" attack:
is not a realistic concern given that a driver can easily override Autopilot at any time by using the steering wheel or brakes and should always be prepared to do so
Well, yes, but the technology is called "Autopilot" and Musk keeps claiming "full autonomy" is just around the corner.

7 comments:

David. said...

Sean Gallagher reports that:

"Delivery of Boeing’s promised fix to the flight system software at the center of two 737 MAX crash investigations has been pushed back several weeks after an internal review by engineers not connected to the aircraft raised additional safety questions. The results of the “non-advocate” review have not been revealed, but the Federal Aviation Administration confirmed on April 1 that the software needed additional work."

David. said...

Although they did RTFM, it looks like it didn't help:

"Pilots at the controls of the Boeing Co. 737 MAX that crashed in March in Ethiopia initially followed emergency procedures laid out by the plane maker but still failed to recover control of the jet, according to people briefed on the probe’s preliminary findings."

David. said...

In "Whistleblowers: FAA 737 MAX safety inspectors lacked training, certification", Sean Gallagher reports that:

"Multiple whistleblowers have raised issues over the Federal Aviation Administration’s safety inspection process connected to Boeing’s 737 MAX aircraft, according to a letter to the FAA from Senate Commerce Committee chairman Sen. Roger Wicker on April 2. And the FAA’s leadership was informed of these concerns as far back as August of 2018.

The whistleblowers cited “insufficient training and improper certification” of FAA aviation safety inspectors, “including those involved in the Aircraft Evaluation Group (AEG) for the Boeing 737 MAX," Wicker said in his letter to FAA acting administrator David Elwell."

Both Boeing and the FAA have serious credibility problems.

David. said...

Izabella Kaminska and Jamie Powell's "Uber's conflicting self-driving fleet vision" analyzes Uber's IPO documents and shows that (a) Uber is betting its future on a fleet of Level 5 cars, and (b) the economics of this bet simply don't work (and of course neither does the technology):

"But here's the really important factor for would-be buyers of the stock on IPO day. Uber says autonomous driving is essential for it to continue to effectively compete, but it also says these development efforts are capital and operations intensive (the opposite of its supposed asset-light business model today)."

The quotes they emphasize from the IPO documents are fairly devastating.

David. said...

Yet again William Gibson was prophetic. In "Defense against the Darknet, or how to accessorize to defeat video surveillance", Thomas Claburn describes a real-life version of the "ugliest T-shirt" from Gibson's Zero History.

David. said...

Julie Bort's "An engineer at Uber's self-driving car unit warns that it's more like 'a science experiment' than a real car capable of driving itself" shows that in autonomous cars, as in everything else, Uber is following the "fake it until you make it" path of today's Silicon Valley startups.

And for the few in the audience who haven't read Gibson, the "ugliest T-shirt" makes the wearer invisible to surveillance cameras. Makes pedestrians even more of a problem for self-driving cars, no?

David. said...

Another good post on the 737 MAX crashes is "How the Boeing 737 Max Disaster Looks to a Software Developer" by Gregory Travis:

"So Boeing produced a dynamically unstable airframe, the 737 Max. That is big strike No. 1. Boeing then tried to mask the 737’s dynamic instability with a software system. Big strike No. 2. Finally, the software relied on systems known for their propensity to fail (angle-of-attack indicators) and did not appear to include even rudimentary provisions to cross-check the outputs of the angle-of-attack sensor against other sensors, or even the other angle-of-attack sensor. Big strike No. 3.