Tuesday, April 6, 2021

Elon Musk: Threat or Menace?

Although both Tesla and SpaceX are major engineering achievements, Elon Musk seems completely unable to understand the concept of externalities, the unaccounted-for costs that society bears as a result of these achievements.

First, in Tesla: carbon offsetting, but in reverse, Jaime Powell reacted to Tesla taking the $1.6B in carbon offsets that provided the only profit Tesla has ever made and putting it into Bitcoin:
Looked at differently, a single Bitcoin purchase at a price of ~$50,000 has a carbon footprint of 270 tons, the equivalent of 60 ICE cars.

Tesla’s average selling price in the fourth quarter of 2020? $49,333.

We’re not sure about you, but FT Alphaville is struggling to square the circle of “buy a Tesla with a bitcoin and create the carbon output of 60 internal combustion engine cars” with its legendary environmental ambitions.

Unless, of course, that was never the point in the first place.
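Powell's arithmetic is easy to check. Here is a back-of-the-envelope sketch in Python; the implied per-car annual emissions figure and the extrapolation to Tesla's entire $1.5B Bitcoin purchase are my own assumptions, not FT Alphaville's:

```python
# Back-of-the-envelope check of the FT Alphaville figures.
BTC_PRICE_USD = 50_000   # approximate Bitcoin price at the time
CO2_PER_BTC_T = 270      # FT's estimated footprint per BTC purchase, tonnes

# If 270 tonnes equals 60 ICE cars, the implied per-car figure is:
ice_car_t = CO2_PER_BTC_T / 60
print(f"Implied ICE car footprint: {ice_car_t:.1f} t/year")
# ~4.5 t/year, close to the EPA's ~4.6 t CO2/year for a typical
# US passenger vehicle, so FT appears to mean annual emissions.

# Extrapolating (my assumption, not FT's) to Tesla's $1.5B purchase:
btc_bought = 1.5e9 / BTC_PRICE_USD                # ~30,000 BTC
total_mt = btc_bought * CO2_PER_BTC_T / 1e6       # megatonnes CO2
print(f"Implied footprint of the whole purchase: {total_mt:.1f} Mt CO2")
```

On these assumptions Tesla's Bitcoin position carries the footprint of roughly 1.8 million ICE car-years, which makes the "environmental ambitions" look even worse.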
Below the fold, more externalities Musk is ignoring.

Second, there is Musk's obsession with establishing a colony on Mars. Even assuming SpaceX can stop their Starship second stage exploding on landing, and do the same with the much bigger first stage, the Mars colony scheme would have massive environmental impacts. Musk envisages a huge fleet of Starships ferrying people and supplies to Mars for between 40 and 100 years. The climate effects of dumping this much rocket exhaust into the upper atmosphere over such a long period would be significant. The idea that a world suffering the catastrophic effects of climate change could sustain such an expensive program over many decades simply for the benefit of a minuscule fraction of the population is laughable.

These externalities are in the future. But there is a more immediate set of externalities.

Back in 2017 I expressed my skepticism about "Level 5" self-driving cars in Techno-hype part 1, stressing that the problem was that to get to Level 5, or as Musk calls it "Full Self-Driving", you need to pass through the levels where the software has to hand off to the human. And the closer you get to Level 5, the harder this hand-off problem becomes:
Suppose, for the sake of argument, that self-driving cars three times as good as Waymo's are in wide use by normal people. A normal person would encounter a hand-off once in 15,000 miles of driving, or less than once a year. Driving would be something they'd be asked to do maybe 50 times in their life.

Even if, when the hand-off happened, the human was not "climbing into the back seat, climbing out of an open car window, and even smooching" and had full "situational awareness", they would be faced with a situation too complex for the car's software. How likely is it that they would have the skills needed to cope, when the last time they did any driving was over a year ago, and on average they've only driven 25 times in their life? Current testing of self-driving cars hands off to drivers with more than a decade of driving experience and well over 100,000 miles of it. It bears no relationship to the hand-off problem with a mass deployment of self-driving technology.
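The arithmetic behind those numbers is simple enough to lay out; a minimal sketch, in which the annual-mileage and driving-lifetime figures are round-number assumptions of mine rather than numbers from the original post:

```python
# Hand-off frequency for a hypothetical car 3x as good as Waymo's,
# i.e. one hand-off per 15,000 miles (the figure from the quote).
MILES_PER_HANDOFF = 15_000
MILES_PER_YEAR = 13_500   # assumed US average annual mileage
DRIVING_YEARS = 60        # assumed driving lifetime

handoffs_per_year = MILES_PER_YEAR / MILES_PER_HANDOFF
lifetime_handoffs = handoffs_per_year * DRIVING_YEARS
print(f"Hand-offs per year: {handoffs_per_year:.1f}")   # 0.9, less than one
print(f"Lifetime hand-offs: {lifetime_handoffs:.0f}")   # 54, order of 50
```

The point is that the better the software gets, the rustier the human it hands off to.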
Mack Hogan's Tesla's "Full Self Driving" Beta Is Just Laughably Bad and Potentially Dangerous starts:
A beta version of Tesla's "Full Self Driving" Autopilot update has begun rolling out to certain users. And man, if you thought "Full Self Driving" was even close to a reality, this video of the system in action will certainly relieve you of that notion. It is perhaps the best comprehensive video at illustrating just how morally dubious, technologically limited, and potentially dangerous Autopilot's "Full Self Driving" beta program is.
Hogan sums up the lesson of the video:
Tesla's software clearly does a decent job of identifying cars, stop signs, pedestrians, bikes, traffic lights, and other basic obstacles. Yet to think this constitutes anything close to "full self-driving" is ludicrous. There's nothing wrong with having limited capabilities, but Tesla stands alone in its inability to acknowledge its own shortcomings.
Hogan goes on to point out the externalities:
When technology is immature, the natural reaction is to continue working on it until it's ironed out. Tesla has opted against that strategy here, instead choosing to sell software it knows is incomplete, charging a substantial premium, and hoping that those who buy it have the nuanced, advanced understanding of its limitations—and the ability and responsibility to jump in and save it when it inevitably gets baffled. In short, every Tesla owner who purchases "Full Self-Driving" is serving as an unpaid safety supervisor, conducting research on Tesla's behalf. Perhaps more damning, the company takes no responsibility for its actions and leaves it up to driver discretion to decide when and where to test it out.

That leads to videos like this, where early adopters carry out uncontrolled tests on city streets, with pedestrians, cyclists, and other drivers unaware that they're part of the experiment. If even one of those Tesla drivers slips up, the consequences can be deadly.
Of course, the drivers are only human, so they do slip up:
the Tesla arrives at an intersection where it has a stop sign and cross traffic doesn't. It proceeds with two cars incoming, the first car narrowly passing the car's front bumper and the trailing car braking to avoid T-boning the Model 3. It is absolutely unbelievable and indefensible that the driver, who is supposed to be monitoring the car to ensure safe operation, did not intervene there.
An example of the kinds of problems that can be caused by autonomous vehicles behaving in ways that humans don't expect is reported by Timothy B. Lee in Fender bender in Arizona illustrates Waymo’s commercialization challenge:
A white Waymo minivan was traveling westbound in the middle of three westbound lanes on Chandler Boulevard, in autonomous mode, when it unexpectedly braked for no reason. A Waymo backup driver behind the wheel at the time told Chandler police that "all of a sudden the vehicle began to stop and gave a code to the effect of 'stop recommended' and came to a sudden stop without warning."

A red Chevrolet Silverado pickup behind the vehicle swerved to the right but clipped its back panel, causing minor damage. Nobody was hurt.
The Tesla in the video made a similar unexpected stop. Lee stresses that, unlike Tesla's, Waymo's responsible test program has resulted in a generally safe product, but not one that can yet be shown to be safe enough:
Waymo has racked up more than 20 million testing miles in Arizona, California, and other states. This is far more than any human being will drive in a lifetime. Waymo's vehicles have been involved in a relatively small number of crashes. These crashes have been overwhelmingly minor with no fatalities and few if any serious injuries. Waymo says that a large majority of those crashes have been the fault of the other driver. So it's very possible that Waymo's self-driving software is significantly safer than a human driver.
...
The more serious problem for Waymo is that the company can't be sure that the idiosyncrasies of its self-driving software won't contribute to a more serious crash in the future. Human drivers cause a fatality about once every 100 million miles of driving—far more miles than Waymo has tested so far. If Waymo scaled up rapidly, it would be taking a risk that an unnoticed flaw in Waymo's programming could lead to someone getting killed.
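Lee's point can be made quantitative; a minimal sketch, assuming fatal crashes arrive as a Poisson process at the quoted human rate (the statistical framing is mine, not Lee's):

```python
import math

# Human drivers: ~1 fatality per 100M miles (figure from the quote).
HUMAN_FATALITY_RATE = 1 / 100e6   # fatalities per mile
WAYMO_MILES = 20e6                # Waymo's testing miles to date

expected = HUMAN_FATALITY_RATE * WAYMO_MILES   # 0.2 expected fatalities
p_zero = math.exp(-expected)                   # Poisson P(no fatalities)
print(f"P(zero fatalities even at the human rate): {p_zero:.0%}")  # ~82%
```

In other words, a system exactly as dangerous as a human driver would still, more likely than not, have a fatality-free record after 20 million miles, so Waymo's clean record cannot yet establish that it is safer.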
I'm a pedestrian, cyclist and driver in an area infested with Teslas owned, but potentially not actually being driven, by fanatical early adopters and members of the cult of Musk. I'm personally at risk from these people believing that what they paid good money for was "Full Self Driving". When SpaceX tests Starship at their Boca Chica site they take precautions, including road closures, to ensure innocent bystanders aren't at risk from the rain of debris when things go wrong. Tesla, not so much.

Of course, Tesla doesn't tell the regulators that what the cult members paid for was "Full Self Driving"; that might cause legal problems. As Timothy B. Lee reports, Tesla: “Full self-driving beta” isn’t designed for full self-driving:
"Despite the "full self-driving" name, Tesla admitted it doesn't consider the current beta software suitable for fully driverless operation. The company said it wouldn't start testing "true autonomous features" until some unspecified point in the future.
...
Tesla added that "we do not expect significant enhancements" that would "shift the responsibility for the entire dynamic driving task to the system." The system "will continue to be an SAE Level 2, advanced driver-assistance feature."

SAE level 2 is industry jargon for driver-assistance systems that perform functions like lane-keeping and adaptive cruise control. By definition, level 2 systems require continual human oversight. Fully driverless systems—like the taxi service Waymo is operating in the Phoenix area—are considered level 4 systems.
There is an urgent need for regulators to step up and stop this dangerous madness:
  • The NHTSA should force Tesla to disable "Full Self Driving" in all its vehicles until the technology has passed an approved test program.
  • Any vehicles taking part in such a test program on public roads should be clearly distinguishable from Teslas being driven by actual humans, for example with orange flashing lights. Self-driving test vehicles from less irresponsible companies such as Waymo are distinguishable in this way; Teslas in which some cult member has turned on "Full Self Driving Beta" are not.
  • The FTC should force Tesla to refund, with interest, every dollar paid by their customers under the false pretense that they were paying for "Full Self Driving".

19 comments:

  1. Aaron Gordon's This Is the Most Embarrassing News Clip in American Transportation History is a brutal takedown of yet another of Elon Musk's fantasies:

    "Last night, Shepard Smith ran a segment on his CNBC show revealing Elon Musk's Boring Campany's new Las Vegas car tunnel, which was paid for by $50 million in taxpayer dollars. It is one of the most bizarre and embarrassing television segments in American transportation history, a perfect cap for one of the most bizarre and embarrassing transportation projects in American history."

  2. Eric Berger's A new documentary highlights the visionary behind space settlement reviews The High Frontier: The Untold Story of Gerard K. O'Neill:

    "O'Neill popularized the idea of not just settling space, but of doing so in free space rather than on the surface of other planets or moons. His ideas spread through the space-enthusiast community at a time when NASA was about to debut its space shuttle, which first flew in 1981. NASA had sold the vehicle as offering frequent, low-cost access to space. It was the kind of transportation system that allowed visionaries like O'Neill to think about what humans could do in space if getting there were cheaper.

    The concept of "O'Neill cylinders" began with a question he posed to his physics classes at Princeton: "Is a planetary surface the right place for an expanding industrial civilization?" As it turned out, following their analysis, the answer was no. Eventually, O'Neill and his students came to the idea of free-floating, rotating, cylindrical space colonies that could have access to ample solar energy."

    However attractive the concept is in the far future, I need to point out that pursuing it before the climate crisis has been satisfactorily resolved will make the lives of the vast majority of humanity worse for the benefit of a tiny minority.

  3. ‘No one was driving the car’: 2 men dead after fiery Tesla crash in Spring, officials say:

    "Harris County Precinct 4 Constable Mark Herman told KPRC 2 that the investigation showed “no one was driving” the fully-electric 2019 Tesla when the accident happened. There was a person in the passenger seat of the front of the car and in the rear passenger seat of the car."

  4. Timothy B. Lee's Consumer Reports shows Tesla Autopilot works with no one in the driver’s seat reports:

    "Tesla defenders also insisted that Autopilot couldn't have been active because the technology doesn't operate unless someone is in the driver's seat. Consumer Reports decided to test this latter claim by seeing if it could get Autopilot to activate without anyone in the driver's seat.

    It turned out not to be very difficult.

    Sitting in the driver's seat, Consumer Reports' Jake Fisher enabled Autopilot and then used the speed dial on the steering wheel to bring the car to a stop. He then placed a weighted chain on the steering wheel (to simulate pressure from a driver's hands) and hopped into the passenger seat. From there, he could reach over and increase the speed using the speed dial.

    Autopilot won't function unless the driver's seatbelt is buckled, but it was also easy to defeat this check by threading the seatbelt behind the driver.
    ...
    the investigation makes clear that activating Autopilot without being in the driver's seat requires deliberately disabling safety measures. Fisher had to buckle the seatbelt behind himself, put a weight on the steering wheel, and crawl over to the passenger seat without opening any doors. Anybody who does that knows exactly what they're doing. Tesla fans argue that people who deliberately bypass safety measures like this have only themselves to blame if it leads to a deadly crash."

    Well, yes, but Musk's BS has been convincing them to try stunts like this for years. He has to be held responsible, and he has to disable "Full Self Driving" before some innocent bystanders get killed.

  5. This Automotive News editorial is right but misses the bigger picture:

    "Tesla's years of misleading consumers about its vehicles' "full self-driving" capabilities — or lack thereof — claimed two more lives this month.
    ...
    When critics say the term "autopilot" gives the impression that the car can drive without oversight, Tesla likes to argue that that's based on an erroneous understanding of airplanes' systems. But the company exploits consumers' overconfidence in that label with the way the feature is sold and promoted without correction among Tesla's fanatical online community. Those practices encourage misunderstanding and misuse.

    In public, Musk says the company is very close to full SAE Level 5 automated driving. In conversations with regulators, the company admits that Autopilot and Full Self-Driving are Level 2 driver-assist suites, not unlike those sold by many other automakers.

    This nation does not have a good track record of holding manufacturers accountable when their products are misused by the public, which is what happened in this case."

    It isn't just the Darwin Award winners who are at risk; it is innocent bystanders too.

  6. Bradley Brownell's Tesla Loses A Lot Of Money Selling Cars, But Makes It All Back On Credits And Bitcoin starts:

    "Tesla announced its Q1 2021 financial results in its quarterly earnings call. The company turned a surprisingly large profit this quarter, but it didn’t do it by selling cars. Q1 net profit reached a new record for Tesla, at $438 million. Revenue for the electric car company was up massively to $10.39 billion. Unfortunately, all of that profit is accounted for in the company selling $518 million in regulatory credits, and $101 million was found in buying and then later selling Bitcoin.

    That second point is particularly interesting, as Tesla purchased $1.5 billion worth of BTC, announced that the company would begin accepting BTC as payment for its cars, which drove up the value of BTC, then sold enough BTC to make a hundred million in profit. Strange how that works, eh? Surely nothing untoward going on there. Not at all. DOGE TO THE MOON! #hodlgang"

  7. Aarian Marshall's A Fatal Crash Renews Concerns Over Tesla’s ‘Autopilot’ Claim talks about the Texas crash:

    "the incident again highlights the still-yawning gap between Tesla’s marketing of its technology and its true capabilities, highlighted in in-car dialog boxes and owners’ manuals.
    ...
    There’s a small cottage industry of videos on platforms like YouTube and TikTok where people try to “fool” Autopilot into driving without an attentive driver in the front seat; some videos show people “sleeping” in the back or behind the wheel."

  8. Not content with losing money on cars and making it up with regulatory credits and Bitcoin, Tesla went to extraordinary lengths to hide these details in its shareholder deck, as Jaime Powell's Tesla and the obscure earnings JPEG reports:

    "Remember here: Tesla’s bitcoin sale represented just over a fifth of its net income, and this is only the time it was mentioned, so this isn’t an immaterial line in the deck.

    Anyway, notice anything weird? Yes, that’s correct, the line detailing cost reductions and the $101m of bitcoin sales is not only in a different font, but a slightly different shade of grey.

    Even weirder is that the text — unlike the rest of the paragraph — can’t be highlighted."

    The reason is that the mention of Bitcoin isn't text, it is a JPEG. Almost as if Musk is ashamed of his pump-and-dump scheme. Maybe if it had been Dogecoin instead ...

  9. In Tesla privately admits Elon Musk has been exaggerating about ‘full self-driving’ Andrew J. Hawkins reports a second admission to the California DMV that Musk has been lying about "Full Self-Driving":

    "“Elon’s tweet does not match engineering reality per CJ. Tesla is at Level 2 currently,” the California DMV said in the memo about its March 9th conference call with Tesla representatives, including the director of Autopilot software CJ Moore. Level 2 technology refers to a semi-automated driving system, which requires supervision by a human driver."

    Why do we have to wait for Musk to kill an innocent bystander before this BS gets shut down?

  10. Why hasn’t Waymo expanded its driverless service? Here’s my theory by Timothy B. Lee explains:

    "Last October, Waymo did something remarkable: the company launched a fully driverless commercial taxi service called Waymo One. Customers in a 50-square-mile corner of suburban Phoenix can now use their smartphones to hail a Chrysler Pacifica minivan with no one in the driver's seat.

    And then... nothing. Seven months later, Waymo has neither expanded the footprint of the Phoenix service nor has it announced a timeline for launching in a second city.
    ...
    the areas that are most profitable for a ride-hailing service are also likely to be the most challenging for self-driving software to navigate safely. Ride-hailing services tend to be the most profitable in dense areas with an army of potential customers and scant parking. But those areas also tend to have crowded and chaotic streets.

    So that's why I think Waymo hasn't expanded its service footprint in the seven months since it launched driverless rides—or for that matter in the 2.5 years since it launched Waymo One with safety drivers in 2018. It's not that Waymo's software can't drive safely in Gilbert, Mesa, or Tempe. It's that Waymo ran the numbers and found that an expanded service would be unprofitable even if it could get its operating costs below those of an Uber or Lyft driver."

  11. Jon Brodkin's Tesla owner who “drives” from back seat got arrested, then did it again reports on a case of Rich Asshole Syndrome:

    "The California Highway Patrol said it arrested a man seen riding in the back seat of a Tesla Model 3 that had no one in the driver's seat. Param Sharma, 25, was arrested "and booked into Santa Rita Jail" on counts of reckless driving and disobeying an officer, the department said in a statement Tuesday. Sharma was arrested after multiple 911 calls on Monday around 6:30 pm reported a driverless vehicle "traveling eastbound on I-80 across the San Francisco-Oakland Bay Bridge toward the city of Oakland," police said.

    Sharma spent a night locked up, and he "committed the same crime shortly after being released from jail," according to a story yesterday by KTVU Fox 2:

    Param Sharma met KTVU's Jesse Gary in San Francisco Wednesday afternoon, not far from his mother's high-rise apartment. After getting out of jail on two counts of reckless driving, he pulled up sitting in the back seat of a Tesla with no one in the driver's seat.

    When asked if he purchased a new Tesla after the previous one was impounded he said, "Yeah, I'm rich as [expletive]. I'm very rich.""

  12. Thomas Claburn's Waymo self-driving robotaxi goes rogue with passenger inside, escapes support staff recounts an example showing that even in the benign environment of Chandler, AZ, Waymo's technology can't cope with coned-off lanes. At least it didn't crash into anything.

  13. Lucas Ropek's headline tells the whole story: Tesla Cars Will Now Spy on You to Make Sure You Don’t Autopilot Yourself Into a PR Disaster:

    "In an effort to assuage fears about the dangers inherent in its autopilot system, Tesla has announced that the in-cabin cameras in its Model 3 and Y vehicles will now monitor drivers for “attentiveness” while the feature is turned on, TechCrunch reports."

    One might have thought that this would have been part of the original design of Autopilot, as it was for other car companies.

  14. Anonymous comes down on the side of Musk as menace:

    "people are beginning to see you as nothing more than another narcissistic rich dude who is desperate for attention"

  15. Reuters reports that Thirty Tesla crashes linked to assisted driving system under investigation in US:

    "US safety regulators have opened 30 investigations into Tesla crashes involving 10 deaths since 2016 where an advanced driver assistance system was suspected to have been in use.

    The National Highway Traffic Safety Administration (NHTSA) released a list offering details about crashes under review by its special crash investigations programs.
    ...
    Of the 30 Tesla crashes, NHTSA has ruled out Tesla’s Autopilot in three and published reports on two of the crashes."

  16. British Drivers Aren't Exactly Confident In Autonomous Cars by Elizabeth Blackstock reports that:

    "The study showed that, overall, 36 percent of respondents were concerned about the development of self-driving cars, with another 35 percent identifying as neutral and 30 percent as excited. 41 percent of respondents said they would not be comfortable in any self-driving car scenario, whether they were the one behind the wheel or they were sharing the road with autonomous delivery vehicles."

  17. Cyberlaw experts: Take back control. No, we're not talking about Brexit. It's Automated Lane Keeping Systems by Jo-Ann Pattinson and Subhajit Basu looks at a typically stupid decision by Boris Johnson's administration:

    "The UK government said in April that "the first types of self-driving cars could be on UK roads this year" but this is not entirely accurate.

    Firstly, the announcement refers not to self-driving vehicles, but vehicles fitted with automated lane-keeping systems (ALKS), and secondly, we already have technology similar to this driving on our roads.
    ...
    The government’s announcement seemingly indicates the intention to allow drivers to take their eyes off the road, and for the driving assistance system to be responsible while the system is engaged."

    The authors discuss the "hand-off" problem in considerable detail, and with the same skepticism as I have long expressed.

  18. Greg Bensinger's New York Times op-ed Why Tesla’s ‘Beta Testing’ Puts the Public at Risk is long overdue:

    "there’s nothing innocuous about the beta tests being run by Elon Musk, the billionaire C.E.O. of Tesla. He has turned American streets into a public laboratory for the company’s supposed self-driving car technology."

  19. Tesla ordered to pay $137M to Black former worker subjected to racist workplace by Tim De Chant reports that:

    "Diaz, an elevator operator at the company’s Fremont factory for nine months from 2015 to 2016, had been called racial epithets by coworkers, was told to “go back to Africa,” and saw racist graffiti in the bathrooms. The trial lasted a little over a week, and the jury found that Tesla had not taken reasonable steps to prevent racial harassment.
    ...
    The federal jury didn’t buy the argument, though. Diaz and his attorneys had brought three claims against Tesla—that the company subjected him to a racially hostile workplace, that it failed to provide a workplace free from harassment, and that it was negligent in its supervision of employees—and the jury found in favor of the plaintiff on all claims. It ordered Tesla to pay Diaz $6.9 million in compensatory damages and $130 million in punitive damages, amounts far in excess of what his attorneys requested."
