Tuesday, June 27, 2023

The Philosopher of Palo Alto

I just finished reading John Tinnell's The Philosopher of Palo Alto. Based on Stanford Library's extensive archive of Mark Weiser's papers, and interviews with many participants, it is an impressively detailed and, as far as I can tell, accurate account of the "ubiquitous computing" work he drove at Xerox PARC. I strongly recommend reading it. Tinnell covers Weiser's life story to his death at age 46 in 1999, the contrast between his work and that at Nick Negroponte's MIT Media Lab, and the ultimate failure of his vision.

Tinnell quotes Lucy Suchman's critique of Weiser's approach to innovation:
Under this approach, Suchman claimed, a lab "[provided] distance from practicalities that must eventually be faced" — but facing up to those practicalities was left up to staff in some other department.
To be fair, I would say the same criticism applied to much of the Media Lab's work too.

As I was at the time a member of "staff in some other department" at Sun Microsystems and then Nvidia, below the fold I discuss some of the "practicalities" that should have been faced earlier rather than later or not at all.

Weiser was prophetic in identifying the downsides of the model of the PC and then the phone as a single, all-consuming sink of attention. They are aptly illustrated in this story of the 13-year-old boy who averted a crash of his school bus. He was the only pupil on the bus without a phone, so he was the only one who noticed when the driver suffered a seizure. Everyone else was focused on their phone. I would certainly prefer to live in a world where Weiser's dreams had come true.

But what Weiser failed to understand was that his model of disaggregating functions into "a hundred computers per room" was economically infeasible. While he was correct in forecasting that Moore's Law would make systems-on-a-chip very cheap, this wouldn't make "a hundred computers per room" affordable. Each of those computers would need to be wrapped in a set of components to which Moore's Law didn't apply, such as batteries, displays, cases and so on. Moore's Law would merely mean that the computer part became an insignificant part of the total cost.

And even if the non-Moore's Law components were fairly cheap, equipping a typical house of, say, seven rooms would cost seven hundred times as much as a single one of these "Internet of Things" devices.
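
To make the arithmetic concrete, here is a back-of-the-envelope sketch in Python. The $1 and $9 figures are illustrative assumptions of mine, not numbers from the book or from Weiser:

    # Back-of-the-envelope comparison of Weiser's "hundred computers per
    # room" with what Moore's Law actually delivers. All prices here are
    # illustrative assumptions.
    soc_cost = 1.00         # the system-on-a-chip Moore's Law makes nearly free
    packaging_cost = 9.00   # battery, display, case: parts Moore's Law can't shrink
    device_cost = soc_cost + packaging_cost

    rooms = 7               # the "typical house" above
    devices_per_room = 100  # Weiser's "a hundred computers per room"

    house_cost = rooms * devices_per_room * device_cost
    print(f"per device: ${device_cost:.2f} (only ${soc_cost:.2f} is the computer)")
    print(f"whole house: {rooms * devices_per_room} devices, ${house_cost:,.0f}")
    # per device: $10.00 (only $1.00 is the computer)
    # whole house: 700 devices, $7,000

Even with the computer itself nearly free, the house-wide bill is seven hundred times the single-device cost, and the ratio barely moves however far the system-on-a-chip price falls.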

The alternative model that won out, based first on the PC and then on Steve Jobs' iPhone, was to aggregate all the functions into a single hardware device configured by software to perform all of them. This had two major advantages:
  • Although the "universal device" hardware was more expensive, it was much less than a hundred times more expensive than each of the single-function devices in a room, let alone the thousands of them in all the spaces into which a user carried their phone.
  • With the advent of the iPhone, the "universal device" came equipped with a well-understood and highly profitable business model. The cost of the hardware and the underlying wireless connectivity were paid for by voice calling; the hardware needed for all the other functions wasn't simply cheap, it was free. In contrast, it is hard to see what business model would have supported each of the multitude of ubiquitous devices.

A year after Weiser joined PARC, the Morris Worm had demonstrated that networked devices such as the "hundred computers per room" lived in an adversarial environment, foreshadowing the security dumpster fire that the Internet of Things has become. I've written repeatedly about the way the need to make the Things in the Internet as cheap as possible makes proper software maintenance and security patches impossible, and so have many others. Here, for example, is Bruce Schneier in 2014:
The problem with this process is that no one entity has any incentive, expertise, or even ability to patch the software once it's shipped.
As I discussed in "Nobody cared about security", in the late '80s and '90s the issue of shipping the encryption that is a necessary (but not sufficient) condition for the security of networked devices like those Weiser envisioned was a hot topic. Encryption was classed as a munition and exporting it from the US was potentially criminal, so even those of us who did care about security couldn't do anything about it. It wasn't until the year after Weiser's death that it became reasonably easy to ship adequate encryption. The "staff in some other department" would definitely have known that they needed a way to secure access to the devices, and to support patching their software. A quarter-century later this is still an unsolved problem, primarily for the economic reason Schneier described.

Weiser's vision of an environment in which the networked devices faded into invisibility would be a nightmare for security. It is hard enough for Apple, and much harder for Google, to get people to keep their phones fully updated even though there is typically only one device involved and the user is constantly aware of it. Imagine how much harder it would be if people were simply unaware of the existence, name and location of each of a hundred devices in each of their rooms.

Some would argue that, more than a quarter-century later, Weiser was right that there would be "a hundred computers per room". I took a census of my office, about the most computer-centric space outside a data center. Here is a list of all the CPUs I could find:
  • 1 iPhone
  • 6 laptops
  • 4 desktops
  • 2 tablets
  • 2 Raspberry Pi
  • 15 internal drives in those devices
  • 2 pairs of noise-canceling headphones
  • 4 external DVD-RW drives
  • 20 external hard drives
  • 45 (approximately) external flash devices
  • 1 label printer
  • 1 thermostat
  • 1 LCD monitor
  • 1 keyboard
  • 1 mouse
Thus a total of just over a hundred. So was Weiser right? No: what he meant was a hundred networked, context-aware devices per room. The vast majority (84) of the CPUs I could find are embedded in storage devices. Only 7 of the CPUs are active; the rest are powered down or hibernating. Only 15 of the CPUs are networked, and except for the Raspberry Pis all of those are big and expensive (>$100). Except for the iPhone, none have any context awareness, and that device is pretty much the opposite of Weiser's vision.
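
For the record, here is the same census as a trivial Python tally (counts copied from the list above, with each headphone pair counted as one CPU):

    # Tally of the office CPU census above.
    census = {
        "iPhone": 1, "laptops": 6, "desktops": 4, "tablets": 2,
        "Raspberry Pi": 2, "internal drives": 15, "headphone pairs": 2,
        "external DVD-RW drives": 4, "external hard drives": 20,
        "external flash devices (approx.)": 45, "label printer": 1,
        "thermostat": 1, "LCD monitor": 1, "keyboard": 1, "mouse": 1,
    }
    storage = {"internal drives", "external DVD-RW drives",
               "external hard drives", "external flash devices (approx.)"}

    total = sum(census.values())
    in_storage = sum(n for name, n in census.items() if name in storage)
    print(f"total CPUs: {total}, embedded in storage: {in_storage}")
    # total CPUs: 106, embedded in storage: 84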

2 comments:

  1. Dan Goodin's "Zyxel users still getting hacked by DDoS botnet emerge as public nuisance No. 1" illustrates my point that Weiser's vision would have been a security nightmare:

    "Organizations that have yet to patch a 9.8-severity vulnerability in network devices made by Zyxel have emerged as public nuisance No. 1 as a sizable number of them continue to be exploited and wrangled into botnets that wage DDoS attacks.
    ...
    On Wednesday—12 weeks since Zyxel delivered a patch and seven weeks since Shadowserver sounded the alarm—security firm Fortinet published research reporting a surge in exploit activity being carried out by multiple threat actors in recent weeks. As was the case with the active compromises Shadowserver reported, the attacks came overwhelmingly from variants based on Mirai, an open source application hackers use to identify and exploit common vulnerabilities in routers and other Internet of Things devices."

    The compromised Zyxel devices are not home routers or smart refrigerators. They are corporate firewalls and VPN servers. But they are still unpatched after twelve weeks!

  2. Two reviews of the book. First, The New Yorker, Best Book of the Week:

    "As the chief technology officer of Xerox parc, a research company and erstwhile hotbed of Silicon Valley innovation, Mark Weiser believed that screens were an “unhealthy centripetal force.” Instead of drawing people away from the world, devices should be embedded throughout our built environment—in lights, thermostats, roads, and more—enhancing our perception rather than demanding our focus. Weiser’s pioneering ideas, which he refined in the nineteen-eighties and nineties, led to the present-day Internet of Things, but his vision lost out to the surveillance-capitalist imperatives of Big Tech. Tinnell’s profound biography evokes an alternative paradigm, in which technology companies did not seek to monitor and exploit users."

    Second, Ben Tarnoff in the New York Review of Books reviews it together with Malcolm Harris' Palo Alto: A History of California, Capitalism, and the World at much greater length. Tarnoff writes:

    "Weiser belonged to the second generation of PARC researchers, and he wasn’t interested in honoring his elders. He wanted to disrupt them by abolishing personal computing entirely.

    The story of Weiser’s undertaking is told by John Tinnell, a professor of English at the University of Colorado at Denver, in his new biography The Philosopher of Palo Alto, and it’s refreshingly strange. By night, Weiser played drums in a punk band; by day, he worked to eradicate the pernicious influence of René Descartes. He saw the personal computer as perpetuating a Cartesian partition between mind and body and between person and world. Stationary, alone, transfixed by our screens, we grow more estranged from ourselves and from one another, Weiser believed. (Heidegger was a major inspiration.)"
