Tuesday, September 10, 2019

The Optimist's Telescope: Review

The fundamental problem of digital preservation is that, although it is important and we know how to do it, we don't want to pay enough to have it done. It is an example of the various societal problems caused by rampant short-termism, about which I have written frequently.

Bina Venkataraman has a new book on the topic entitled The Optimist's Telescope: Thinking Ahead in a Reckless Age. Robert H. Frank reviews it in the New York Times:
How might we mitigate losses caused by shortsightedness? Bina Venkataraman, a former climate adviser to the Obama administration, brings a storyteller’s eye to this question in her new book, “The Optimist’s Telescope.” She is also deeply informed about the relevant science.

The telescope in her title comes from the economist A.C. Pigou’s observation in 1920 that shortsightedness is rooted in our “faulty telescopic faculty.” As Venkataraman writes, “The future is an idea we have to conjure in our minds, not something that we perceive with our senses. What we want today, by contrast, we can often feel in our guts as a craving.”

She herself is the optimist in her title, confidently insisting that impatience is not an immutable human trait. Her engaging narratives illustrate how people battle and often overcome shortsightedness across a range of problems and settings.
Below the fold, some thoughts upon reading the book.

The plot of Isaac Asimov's Foundation Trilogy evolves as a series of "Seldon crises", in which simultaneous internal and external crises combine to force history into the path envisioned by psychohistorian Hari Seldon and the Foundations he established with the aim of reducing the duration of the dark ages after the fall of the Galactic Empire from 30,000 to 1,000 years.

The world today feels as though it is undergoing a Seldon crisis, with external (climate change) and internal (inequality, the rise of quasi-fascist leaders of "democracies") crises reinforcing each other.  What is lacking is a Foundation charting a long-term future that minimizes the dark ages to come after the fall of civilization.

What ties the various current crises together is short-termism: all levels of society are incapable of long-term thinking, and fail to resist eating the marshmallow. In her introduction Venkataraman writes:
I argue in this book that many decisions are made in the presence of information about future consequences but in the absence of good judgement. We try too hard to know the exact future and do too little to be ready for its many possibilities. The result is an epidemic of recklessness, a colossal failure to plan ahead.
...
To act on behalf of our future selves can be hard enough; to act on behalf of future neighbors, communities, countries of the planet can seem impossible, even if we aspire to that ideal. By contrast, it is far easier to respond to an immediate threat.
She divides her book into three parts, and in each deploys an impressive range of examples of the problems caused by lack of foresight. But it is an optimistic book, because in each part she provides techniques for applying foresight and examples of their successful application.

Part 1: Individual and Family

Dorian, the second-strongest North Atlantic hurricane on record, was ravaging the Bahamas as I read Part 1's discussion of why, despite early and accurate warnings, people fail to evacuate or take appropriate precautions for hurricanes and other natural disasters:
It is human nature to rely on mental shortcuts and gut feelings - more than gauges of the odds - to make decisions. ... These patterns of thinking, I have learned, explain why all the investment on better predictions can fall short of driving decisions about the future ... The threats that people take most seriously turn out to be those we can most vividly imagine.
She illustrates why this is hard using the collapse of microfinance in Andhra Pradesh:
A person who might look reckless when poor could look smart and strategic when flush. Realizing that people who are lacking resources often have a kind of tunnel vision for the present helped me understand why many women involved in India's microfinance crisis went against their own future interest, taking on too many loans and falling deep into debt. It also explains why the poorest families have more trouble heeding hurricane predictions.
The problem on the lending side of the collapse was "be careful what you measure". The microfinance companies were measuring the number of new loans, and the low default rate, not noticing that the new loans were being used to pay off old ones.

The same phenomenon of scarcity causing recklessness helps explain why black kids in schools suffer more severe discipline:
in exasperated moments, impulsive decisions reflecting ingrained biases become more likely. Teachers, like all of us, are exposed to portrayals in the media and popular culture of black people as criminals, and those images shape unconscious views and actions.

University of Oregon professors Kent McIntosh and Erik Girvan call these moments of discipline in schools "vulnerable decision points." They track discipline incidents in schools around the country and analyze the data to show school administrators and teachers are often predictable. When teachers are fatigued at the end of a school day or week, or hungry after skipping lunch for meetings, they are more likely to make rash decisions.
...
This bears out the link Eldar Shafir and Sendhil Mullainathan have shown between scarcity - in this case, of time and attention - and reckless decision making. It is similar to the pattern that hamstrings the poor from saving for their future.
Among the techniques she discusses for imagining the future are virtual reality experiences, and simpler techniques such as:
an annual gathering where each person writes his own obituary and reads it aloud to the group.
Prototype Clock
Another is Danny Hillis' 10,000-year clock:
The clock idea captivated those whom Hillis told about it, including futurist and technology guru Stewart Brand and the musician Brian Eno.
And me. IIRC it was at the 1996 Hackers Conference that Hillis and Brand talked about the idea of the clock. The presentation set me thinking about the long-term future of digital information, and about how systems to provide it needed to be ductile rather than brittle, like Byzantine Fault Tolerance. The LOCKSS Program was the result a couple of years later.

ERNIE 1 via Wikipedian geni
Another technique she calls "glitter bombs" - you'll need to read the book to find out why. The UK's Premium Bonds and other prize-linked savings schemes are examples:
The British government launched its Premium Bonds program in 1956, to encourage savings after World War II. For the past seven decades, between 22 and 40 percent of UK citizens have held the bonds at any given time. The savers accept lower guaranteed returns than comparable government bonds in exchange for the prospect of winning cash prizes during monthly drawings. Tufano's research shows that people who save under these schemes typically do so not instead of saving elsewhere but instead of gambling.
As kids, my brother and I routinely received small Premium Bonds as birthday or Christmas gifts. I recall watching on TV as "ERNIE" chose winners, but I don't recall ever being one.

Part 2: Businesses and Organizations

Venkataraman introduces this part thus:
The unwitting ways that organizations encourage reckless decisions may pose an even greater threat, however, than the cheating we find so repulsive. The work of John Graham at the National Bureau of Economic Research puts eye-popping scandals into perspective. He has shown that more money is lost for shareholders of corporations ... by the routine, legal habit of executives making bad long-term decisions to boost near-term profits than what is siphoned off by corporate fraud.
Among her examples of organizational short-termism are the Dust Bowl, gaming No Child Left Behind by "teaching to the test", over-prescribing of antibiotics, over-fishing, and the Global Financial Crisis (GFC). For each, she discusses examples of successful, albeit small-scale, mitigations:
  • The Dust Bowl was caused by the economic incentives, still in place, for farmers to aggressively till their soil to produce more annual crops. She describes how people are developing perennial crops, needing much less tilling and irrigation:
    Perennial grains, unlike annuals, burrow thick roots ten to twenty feet deep into the ground. Plants with such entrenched roots don't require much irrigation and they withstand drought better. Perennial roots clench the fertile topsoil like claws and keep it from washing away. This makes it possible for a rich soil microbiome to thrive that helps crops use nutrients more efficiently. A field of perennials does not need to be plowed each year, and so more carbon remains trapped in its soil instead of escaping to the atmosphere
    But:
    To get perennial grains into production, Jackson also had to figure out how to overcome farmers' aversion to taking risks on unknown crops, and their immediate fears of not having buyers for their product. Researchers from the Land Institute and University of Minnesota have brokered deals for twenty farmers to plant fields with a perennial grain that resembles wheat. They persuaded the farmers by securing buyers willing to pay a premium for the grain
    This is an impressive demonstration of making "what lasts over time pay in the short run", but scaling up to displace annual grains in the market is an exercise left to the reader.
  • Montessori and similar educational philosophies (e.g. Reggio Emilia early childhood education) are known to be effective alternatives to the testing-based No Child Left Behind. But they aren't as easy to measure, and thus to justify deploying widely. So this is what we get:
    Other reports have documented how "teaching to the test" curtails student curiosity, and how it has even driven some teachers and principals to cheat by correcting student answers. The metric might work for organizations at the bottom of the heap, but not for those near the top.
    Organizations at the bottom of the heap have low-hanging fruit, so they can see how to improve. It is much more difficult for organizations near the top to see how to improve, so the temptation to cheat is greater.
  • Doctors have been effective at curbing over-prescribing by their colleagues using an in-person, patient-specific "postgame rehash" when suspect prescriptions are detected. But:
    The drawback is that it requires a lot of time and legwork, and even hospitals with antibiotic stewardship teams lack the resources to do this across an entire hospital year-round.
    So although this approach works, it can't scale up to match the problem of over-prescribing in hospitals, let alone by GPs. And it clearly can't deal with the even more difficult problem of agricultural over-use of antibiotics.
  • Attempts to reduce over-fishing by limiting fishing days and landings haven't been effective. They lead to intensive, highly competitive "derby days" during which immature fish are killed and dumped, and prices crash because the entire quota arrives on the market at the same time. Instead, the approach of "catch shares", in effect giving the fishermen equity in the fishery, has brought the Gulf Coast red snapper fishery back from near-extinction:
    The success of catch shares shows that agreements to organize businesses - and wise policy - can encourage collective foresight. Programs that align future interests with the present can, in the words of Buddy Guindon, turn pirates into stewards.
    It isn't clear that it would have been possible to implement catch shares before the fishery faced extinction.
  • The Global Financial Crisis of 2008 was driven by investors' monomaniacal focus on quarterly results, and thus by executives' monomaniacal focus on manipulating them to enhance their stock options and annual bonuses. She responds with the story of Eagle Capital Management, a patient value investment firm which, after enduring years of sub-par performance, flourished during and after the dot-com bust:
    Eagle fared well and way outperformed the plummeting markets in 1999 and 2000. In just those two years, the gains more than made up for the losses of the previous five. Today, the company has grown to manage more than $25 billion in assets and, on average, earned an annual return of more than 13 percent on its investments between 1998 and 2018. That's more than double the annual return from the S&P 500 during that time.
    Some of my money is managed by a firm with a similar investment strategy, so I can testify to the need for patience and a long-term view. Value investing has been out of favor during the recovery from the GFC. Note that the whole reason for Eagle's success was that most competitors were doing something different; if everyone had been taking Eagle's long view the GFC wouldn't have happened but Eagle would have been a run-of-the-mill performer.
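The arithmetic behind those quoted figures is worth making concrete. A quick compound-growth sketch (the 13% rate is from the quoted passage; the 6% S&P-like rate is my illustrative reading of "more than double"):

```python
def total_return(annual_rate, years):
    """Multiple on initial capital after compounding annual_rate for years."""
    return (1 + annual_rate) ** years

# Eagle's quoted ~13%/year over the twenty years 1998-2018:
eagle = total_return(0.13, 20)   # roughly 11.5x the starting capital
# An S&P-like ~6%/year (implied by "more than double the annual return"):
sp500 = total_return(0.06, 20)   # roughly 3.2x
```

Doubling the annual rate more than triples the twenty-year outcome, which is the whole case for patience.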
She examines long-lived biological systems, including:
The Pando aspen colony in Utah, ... is more than eighty thousand years old, and it has persisted by virtue of self-propagation - cloning itself - and by slow migration to fulfill its needs for water and nutrients from the soil. It even survived the volcanic winter spurred by the massive eruption seventy-five thousand years ago on Sumatra.
...
Its strategy - making lots of copies of itself - is one echoed by digital archivist David Rosenthal ... Lots of copies dispersed to different environments and organizations, Rosenthal told me, is the only viable survival route for the ideas and records of the digital age.
Rhizocarpon geographicum
She is right that systems intended to survive for the long term need high levels of redundancy, and low levels of correlation. She also points out another thing they need:
Another secret of some of the oldest living things on Earth is slow growth. Sussman documents what are known as map lichens in Greenland, specimens at least three thousand years old that have grown one centimeter every hundred years - a hundred times slower than the pace of continental drift.
The need to force systems to operate relatively slowly by imposing rate limits is something that I've written about several times (as has Paul Vixie), for example in Brittle Systems:
The design goal of almost all systems is to do what the user wants as fast as possible. This means that when the bad guy wrests control of the system from the user, the system will do what the bad guy wants as fast as possible. Doing what the bad guy wants as fast as possible pretty much defines brittleness in a system; failures will be complete and abrupt.
Rate limits are essential in the LOCKSS system. Another of her examples is also about rate limits. Gregg Popovich, coach of the San Antonio Spurs:
pioneered the practice of keeping star players out of games for rest to prevent later injuries.
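The rate-limiting idea can be sketched as a token bucket; this is a generic illustration of the principle, not the LOCKSS protocol's actual mechanism:

```python
import time

class TokenBucket:
    """Allow at most `rate` operations per second, with bursts up to `capacity`.

    Even an attacker in full control of the caller can do no more than
    `rate` operations per second of damage on average: failures become
    slow and partial instead of complete and abrupt.
    """
    def __init__(self, rate, capacity):
        self.rate = rate          # tokens replenished per second
        self.capacity = capacity  # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        # Replenish tokens for the time elapsed since the last call,
        # then spend one token if any are available.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

A caller hammering `allow()` in a tight loop gets its first `capacity` requests through, then is throttled to `rate` per second no matter how fast it tries.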
Harrison's H4
Phantom Photographer
The equivalent of "glitter bombs" in this part is prizes; the earliest and perhaps most famous success is the Longitude Prize, the £20,000 reward that motivated John Harrison's succession of marine chronometers (preserved in working order at the Royal Greenwich Observatory). More recent successful prizes include the Ansari X Prize for spaceflight and DARPA's prizes kick-starting autonomous car technology. But note that none of the recent successful prizes have spawned technologies relevant to solving the Seldon crisis we face.

One interesting technique she details is "prospective hindsight":
In contrast to the more common practice of describing what will happen in the future, prospective hindsight requires assuming something already happened and trying to explain why. This shifts people's focus away from mere prediction of future events and toward evaluating the consequences of their current choices.
In the early days of Vitria Technology, my third startup, we worked with FedEx. One of the many impressive things about the company was their morning routine of reviewing the events of the previous 24 hours, enumerating everything that had gone wrong and identifying the root causes. Explaining why is an extremely valuable process.

Part 3: Communities and Society

Some of the examples in this part, such as the warnings of potential for terrorism at the Munich Olympics, the siting of the Fukushima reactors, the extraordinary delay in responding to the Ebola outbreak:
E-mails later published by the Associated Press revealed that officials knew of the potential danger and scope of the epidemic months before the designation [of a global emergency], and were warned of its scale by Doctors Without Borders ... The World Health Organization's leaders, however, were worried about declaring the emergency because of the possible damage to the economies of the countries at the epicenter of the outbreak.
and the Indian Ocean tsunami:
After Dr. Smith Dharmasaroja, the head meteorologist of Thailand, advocated in 1998 for creating a network of sirens to warn of incoming tsunamis in the Indian Ocean, the ruling government replaced him. His superiors argued that a coastal warning system might deter tourists, as they would see Thailand as unsafe. Six years later, a massive Indian Ocean tsunami killed more than 200,000 people including thousands in coastal Thailand, many of them tourists.
show how the focus on short-term costs has fatal consequences. One reason is "social discounting", the application of a discount rate to estimated future costs to reduce them to a "net present value". This technique might have value in purely economic computations, although, as I pointed out in Two Sidelights on Short-Termism, in practice it gives wrong answers:
I've often referred to the empirical work of Haldane & Davies and the theoretical work of Farmer and Geanakoplos, both of which suggest that investors using Discounted Cash Flow (DCF) to decide whether an investment now is justified by returns in the future are likely to undervalue the future. ... Now Harvard's Greenwood & Shleifer, in a paper entitled Expectations of Returns and Expected Returns, reinforce this ... They compare investors' beliefs about the future of the stock market as reported in various opinion surveys, with the outputs of various models used by economists to predict the future based on current information about stocks. They find that when these models, all enhancements to DCF of one kind or another, predict low performance investors expect high performance, and vice versa. If they have experienced poor recent performance and see a low market, they expect this to continue and are unwilling to invest. If they see good recent performance and a high market they expect this to continue. Their expected return from investment will be systematically too high, or in other words they will suffer from short-termism.
But as applied to investments in preventing future death and disaster these techniques uniformly fail, partly because they undervalue human life, and partly because they underestimate the risk of death and disaster, because they cannot enumerate all the potential risks.
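To see how easily social discounting erases distant harms, here is a minimal net-present-value sketch (the rates and the $1 billion figure are illustrative, not drawn from the studies cited above):

```python
def present_value(future_cost, years, rate):
    """Discount a cost incurred `years` from now back to today at `rate`."""
    return future_cost / (1 + rate) ** years

# A $1 billion disaster a century out nearly vanishes at typical rates:
for rate in (0.03, 0.07):
    print(f"{rate:.0%}: ${present_value(1e9, 100, rate):,.0f}")
```

At 3% the billion-dollar disaster discounts to roughly $50 million today; at 7% to barely a million. The choice of discount rate, not the estimate of the harm, decides whether prevention "pays".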

Peter Schwartz founded the Global Business Network, which has decades of experience running scenario planning exercises for major corporations. He:
has discovered that people are tempted to try to lock in on a single possible scenario that they prefer or see as the most likely and simply plan for that - defeating the purpose of scenario generation.
The purpose being, of course, to get planners to think about the long tail of "black swan" events.

The optimistic examples in this part are interesting, especially her account of the fight against the proposed Green Diamond development in the floodplain of Richland County, South Carolina, and Jared Watson's Eagle Scout project to educate the citizens of Mattapoisett, Massachusetts about the risk of flooding. But, as she recounts:
In each of these instances, a community's size, or at least its cultural continuity between past and present, has made it easier to create and steward collective heirlooms. Similarly, of the hundreds of stone markers dedicated to past tsunamis in Japan, the two that were heeded centuries later were both in small villages, where oral tradition and school education reinforced the history and passed down the warning over time.

Coda

Venkataraman finishes with an optimistic coda pointing to the work of Paul Bain and his colleagues who:
demonstrated that even climate deniers could be persuaded of the need for "environmental citizenship" if the actions to be taken, such as reducing carbon emissions, were framed as improvements in the way people would treat one another in the imagined future. A collective idea of the future in which people work together on environmental problems, and are more caring and considerate - or a future with greater economic and technological progress - motivated the climate change deniers to support such actions even when they didn't believe that human-caused climate change was a problem.
She enumerates the five key lessons she takes away from her work on the book:
  1. Look beyond near-term targets. We can avoid being distracted by short-term noise and cultivate patience by measuring more than immediate results.
  2. Stoke the imagination. We can boost our ability to envision the range of possibilities that lie ahead.
  3. Create immediate rewards for future goals. We can find ways to make what's best for us over time pay off in the present.
  4. Direct attention away from immediate urges. We can reengineer cultural and environmental cues that condition us for urgency and instant gratification.
  5. Demand and design better institutions. We can create practices, laws and institutions that foster foresight.

My Reaction

It is hard not to be impressed by the book's collection of positive examples, but it is equally hard not to observe that in each case there are great difficulties in scaling them up to match the threats we face.

And, in particular, there is a difficulty they all share that is inherent in Venkataraman's starting point:
I argue in this book that many decisions are made in the presence of information about future consequences but in the absence of good judgement.
The long history of first the tobacco industry's and subsequently the fossil fuel industry's massive efforts to pollute the information environment casts great doubt on the idea that "decisions are made in the presence of information about future consequences" if those consequences affect oligopolies. And research is only now starting to understand how much easier it is for those who have benefited from the huge rise in economic inequality to use social media to the same ends. As just one example:
Now researchers led by Penn biologist Joshua B. Plotkin and the University of Houston’s Alexander J. Stewart have identified another impediment to democratic decision making, one that may be particularly relevant in online communities.

In what the scientists have termed “information gerrymandering,” it’s not geographical boundaries that confer a bias but the structure of social networks, such as social media connections.

Reporting in the journal Nature, the researchers first predicted the phenomenon from a mathematical model of collective decision making, and then confirmed its effects by conducting social network experiments with thousands of human subjects. Finally, they analyzed a variety of real-world networks and found examples of information gerrymandering present on Twitter, in the blogosphere, and in U.S. and European legislatures.

“People come to form opinions, or decide how to vote, based on what they read and who they interact with,” says Plotkin. “And in today’s world we do a lot of sharing and reading online. What we found is that the information gerrymandering can induce a strong bias in the outcome of collective decisions, even in the absence of ‘fake news.’
In this light Nathan J. Robinson's The Scale Of What We're Up Against makes depressing reading:
It can be exhausting to realize just how much money is being spent trying to make the world a worse place to live in. The Koch Brothers are often mentioned as bogeymen, and invoking them can sound conspiratorial, but the scale of the democracy-subversion operation they put together is genuinely quite stunning. Jane Mayer, in Dark Money, put some of the pieces together, and found that the Charles Koch Foundation had subsidized “pro-business, antiregulatory, and antitax” programs at over 300 institutes of higher education. That is to say, they endowed professorships and think tanks that pumped out a constant stream of phony scholarship. They established the Mercatus Center at George Mason University, a public university in Virginia. All of these professors, “grassroots” groups, and think tanks are dedicated to pushing a libertarian ideology that is openly committed to creating a neo-feudal dystopia.
The Kochs provide just a small part of the resources devoted to polluting the information environment. Social networks, as the Cambridge Analytica scandal shows, have greatly improved the productivity of these resources.

I'm sorry to end my review of an optimistic book on a pessimistic note. But I'm an engineer, and much of engineering is about asking What Could Possibly Go Wrong?

2 comments:

David. said...

Facing the Great Reckoning Head-On, danah boyd's speech accepting one of this year's Barlow awards from the EFF, is a must-read. It is in its own way another plea for longer-term thinking:

"whether we like it or not, the tech industry is now in the business of global governance.

“Move fast and break things” is an abomination if your goal is to create a healthy society. Taking short-cuts may be financially profitable in the short-term, but the cost to society is too great to be justified. In a healthy society, we accommodate differently abled people through accessibility standards, not because it’s financially prudent but because it’s the right thing to do. In a healthy society, we make certain that the vulnerable amongst us are not harassed into silence because that is not the value behind free speech. In a healthy society, we strategically design to increase social cohesion because binaries are machine logic not human logic."

David. said...

Last October, Alex Nevala-Lee made the same point about Hari Seldon in What Isaac Asimov Taught Us About Predicting the Future:

"Asimov later acknowledged that psychohistory amounted to a kind of emotional reassurance: “Hitler kept winning victories, and the only way that I could possibly find life bearable at the time was to convince myself that no matter what he did, he was doomed to defeat in the end.” The notion was framed as a science that could predict events centuries in advance, but it was driven by a desire to know what would happen in the war over the next few months — a form of wishful thinking that is all but inevitable at times of profound uncertainty. Before the last presidential election, this impulse manifested itself in a widespread obsession with poll numbers and data journalism"