Tuesday, June 2, 2020

Informational Capitalism

In The Law of Informational Capitalism, Prof. Amy Kapczynski of the Yale Law School reviews two books, Shoshana Zuboff’s The Age of Surveillance Capitalism and Julie Cohen’s Between Truth and Power: The Legal Constructions of Informational Capitalism, to document the legal structures on which the FAANGs and other "big tech" companies depend for their power.

Below the fold, some commentary on her fascinating article.

Kapczynski starts by critiquing Zuboff's book:
The book is extraordinarily acute in its grasp of the business models and aspirations of the largest internet firms and describes in exquisite detail why they are deeply troubling. ... Zuboff is right that our autonomy and individuality are today at risk in new ways. But she has little to say about the monopoly power of new platforms, or about their role in reshaping labor markets and intensifying forms of inequality. She ignores the fact that we are not all equally vulnerable to these new forms of power. ... Given the manifesto-like quality of the book, it is something of a shock when you realize that Zuboff’s dream is a world dominated by firms like Apple, instead of firms like Google. That view, once uncovered, has little appeal, nor does it help us think about many of the extraordinarily important modes of private power facilitated by information technologies today.
Then Cohen's:
Cohen argues that we live not in an age of "surveillance" capitalism — which trains our focus on dynamics of surveillance and behavioral control — but in an age of "informational capitalism" — which focuses our attention on informationalism as a broader mode of development in the contemporary political economy. Her broader framework captures transformations across a much wider range of settings and calls attention not only to Zuboff’s instrumentarian power but also to rising platform power, monopoly power, and the power that technology can give capital over workers and governments over the governed.
And brings out what she sees as Cohen's key contribution, showing:
how these changes are mediated at every moment by law: for example, law has enabled de facto property regimes in both data and algorithms, although neither are formally property.
Based on these reviews, Kapczynski proceeds to:
construct an account of the “law of informational capitalism,” with particular attention to the law that undergirds platform power. Once we come to see informational capitalism as contingent upon specific legal choices, we can begin to consider how democratically to reshape it. Though Cohen does not emphasize it, some of the most important legal developments — specifically, developments in the law of takings, commercial speech, and trade — are those that encase private power from democratic revision. Today’s informational capitalism brings a threat not merely to our individual subjectivities but to equality and our ability to self-govern. Questions of data and democracy, not just data and dignity, must be at the core of our concern.
She quotes Cohen:
Information technologies are highly configurable, and their configurability offers multiple points of entry for interested and well-resourced parties to shape their development. To understand what technology signifies for the future of law, we must understand how the design of networked information technologies within business models reflects and reproduces economic and political power.
She describes Cohen's account of what the "interested and well-resourced parties" did to intellectual property law. First, patents and copyrights:
U.S. law shifted to allow corporations to own patents and copyrights, to control their employees’ creations, and to claim follow-on or derivative creations and innovations as new forms of property. The purported aim of these laws also shifted over time from the advancement of learning and the arts, which gave significant priority to the diffusion of knowledge, to an emphasis on innovation incentives, which prioritized the perspective of producers.
Second, trademarks:
Trademark law creates key assets for the informational economy as well, anchoring processes of branding and advertising. This area of law also expanded in the 1990s, moving beyond the protection of consumers from confusion to create a more robust and property-like entitlement to the goodwill associated with a brand, especially for famous marks. The Chicago School made its mark here too, theorizing that trademarks did not just protect from confusion but minimized “search costs,” a rationale that enabled companies to claim far more capacious rights in marks. Other economic theories redescribed corporate marketing and branding activities not as unsavory or wasteful attempts at mind control but as signals of “quality,” reasoning that only well-capitalized firms could afford to invest in their brands. Law changed to relax restrictions on “naked licensing” of trademarks, enabling forms of franchising that permitted corporations like McDonald’s to formally disavow employee relationships with hundreds of thousands of people whose work they intricately controlled. This enabled the creation of the informational service sector, where franchisors could both exert networked control over franchisees and avoid the strictures of labor law.
But in the networked economy these traditional forms of intellectual property are not the most important:
What are the primary sources of economic power for Google, Facebook, the algorithmic financial sector, and the projected new data overlords that will revolutionize medical care, the criminal-justice system, education, and more? Two key sources are trade-secret rights and contract law.
As regards their use of contracts:
Contractual and trade-secrecy claims work in conjunction with firms’ technical control over the network. Platforms can deploy contracts with their vendors, customers, and collaborators that require data and algorithms to be kept secret or not shared because they inhabit privileged technical positions at the nodes of networks that millions of people want to access. These contracts deny users control over their data or any access to the companies’ valuable secrets. They are “boilerplates” that cannot in practice be amended by users, making them a “powerful tool both for private ordering of behavior and for private re-ordering of even the most bedrock legal rights and obligations.” Some firms use terms-of-use agreements that also forbid users from undertaking research that might disclose aspects of their platform’s functioning. The impact of these contracts is dramatically amplified by overbroad laws like the federal Computer Fraud and Abuse Act, which render certain violations of such terms-of-use agreements criminal.
Again, "interested and well-resourced parties" were at work altering the law in their favor:
Although Cohen does not discuss it, changes in contract and trade-secrecy law were essential to platforms’ ability to anchor this new power. Without changes in the law of contracts that blessed digital “click-wrap” agreements, platform power could not have evolved as it has. The subject matter of trade-secret law has also expanded dramatically over the decades, from a narrow tort-based right to prevent competitors from stealing formulas and employees to an expansive property-like right to any valuable and secret commercial information. No one has yet written a full history of these developments that situates them in the rise of incentive-based accounts of IP and corporate efforts to expand their protection over data. But evidence of key inflection points can be found in the expansions in the types of information protectable as trade secrets: from the early 1939 Restatement (First) of Torts definition of trade secrets, to the notably broader definitions in the Uniform Trade Secrets Act of 1979 and the Restatement (Third) of Unfair Competition of 1995, and then to the federalization of trade-secrets law in the Defend Trade Secrets Act of 2016.
Note how important the California law rendering "non-compete" agreements void was to the rise of Silicon Valley.

And, of course, the Chicago School's destruction of anti-trust was a huge advantage to the large platforms. Because they don't charge their users, their significant monopoly abuses are much less visible, and affect primarily advertisers or vendors so much smaller than the platforms that legal remedies would be effectively foreclosed, even if those parties were not forced into arbitration by click-through licenses:
Changes in antitrust law in the 1980s and 1990s were important because they narrowed the field’s focus to price effects, making the power of platforms hard to see and challenge. The power of platforms was also enhanced by internet intermediary law and the lack of meaningful consequences for data or other security breaches that threaten users.
See, for example, Equifax. Regulations played their part too:
The Federal Trade Commission enabled personal-data capture by deciding to apply only a thin conception of “unfair trade practices” that authorized data harvesting wherever notice and contractual consent could be shown. The fact that a privacy statement ran into the thousands of pages made no difference.
Kapczynski sums up the effect of these changes in the legal environment in which the FAANGs and others operate:
Through the interaction of these various background entitlements, platforms today enjoy rights to seize data flows. Google and Facebook—but also apps, smart appliances, medical intermediaries, and so forth—occupy particularly powerful nodes in networks that permit them to harvest data in a manner that others cannot. Does that mean that this is a “lawless” domain? Hardly. Since at least the time of Wesley Newcomb Hohfeld, it has been understood that a regime that permits first-comers to seize an asset is also a regime of law. The coding of data as free for the gathering is just as much a rule of law as a property regime in data would be. Cohen likens this regime to the legal construction of the public domain—a place defined by the “absence of prior claims to the resource in question.” She also notes that the public-domain treatment of data works alongside other legal rules that protect platforms (contract law, intermediary immunity, and the like), making the conventional refrain of intellectual-property scholars that there is no property in data formally true but practically false.
I agree with Kapczynski that many early advocates of open access to technology, such as John Perry Barlow, exaggerated its real-world potential:
Cohen also gestures, subtly, to another critical fact: arguments coming from “copyleft” scholars and other progressives skeptical of strong intellectual-property law and who celebrated “peer-to-peer” production helped further some of these same ideas, though with no intention to bolster the corporate power that has benefitted from them (in fact quite the opposite). Some of the most influential voices valorizing and naturalizing innovation were left-libertarian tech utopians. Eben Moglen, a law professor at Columbia and an important theorist of free software, famously insisted that human creativity, like electrons, simply flowed anytime people were connected in networks. It reflected a broader move at the time by many like Moglen: they hailed the creative potential unleashed by innovation and pivoted to a demand for simple openness, because at the time the most important obstacle to creativity and flourishing to many appeared to be overly restrictive intellectual-property law.

As Yochai Benkler described it recently, the most ambitious version of this argument suggested that “winning political battles over free software or open source hardware could make people better able to live independent lives than winning political battles over labor or employment law.” The dream that open software could free us all and that one could “hack” the broader sociopolitical system by demanding openness at a certain technological layer, now seems painfully, obviously wrong.
Yes, but as we celebrate the life of Larry Kramer, it seems appropriate to point out that moderation in advocacy isn't a way to move the Overton Window or force change on the world. The advocacy of Barlow, Moglen and their fellow utopians did change the world in many beneficial ways. Kapczynski shows she doesn't understand this when she writes:
Today, for example, open-source software is fully integrated into Google’s Android phones. The volunteer labor of thousands thus helps power Google’s surveillance-capitalist machine. As Cohen has pointed out, freedom at one layer in the stack often ends up meaning that control is simply exerted elsewhere, with “free” inputs subsumed into a broader apparatus of control.
Actually, Apple's products also derive from open source software (FreeBSD). The operating systems that power most of the cloud and most of the Internet of Things all derive from the work of the Unix community in the days before open source licenses. That is where the culture of openly sharing one's work developed, thanks to anti-trust enforcement and the AT&T consent decree.

But this wasn't "the volunteer labor of thousands". In order to work on Unix you needed to be a member of an organization that had signed the Unix license. All contributors were paid for their work by their organization in one way or another. That was certainly true of me and everyone else working to develop the X Window System; we were paid by MIT or Digital Equipment or, in my case, Sun Microsystems. What was true of our efforts 34 years ago is still true today:
It is worth noting that, even if one assumes that all of the “unknown” contributors are working on their own time, well over 85 percent of all kernel development is demonstrably done by developers who are being paid for their work.

Interestingly, the volume of contributions from unpaid developers has been in slow decline for many years. It was 14.6 percent in the 2012 version of this report, but is 8.2 percent this time around. There are many possible reasons for this decline, but, arguably, the most plausible of those is quite simple: kernel developers are in short supply, so anybody who demonstrates an ability to get code into the mainline tends not to have trouble finding job offers. Indeed, the bigger problem can be fending those offers off. As a result, volunteer developers tend not to stay that way for long.
The companies that fund Linux and other open source developments make a strategic decision that doing so is in their long-term interest. And so do the contributors who truly are volunteers, in that a record of contributions increases their value in the employment market. It is definitely the case that there are gaps in this support, with important infrastructure components dependent on the labor of individual volunteers. But even those put-upon developers are happy to see their work valued and used.

Kapczynski is right that there is much to dislike about the world of the FAANGs, but she would dislike even more a world in which each of them had their own proprietary network protocols, operating systems and other infrastructure layers. The fact that they don't is due in large part to the advocates she criticizes.

2 comments:

  1. Cory Doctorow has a Twitter thread announcing the publication online, in full, of How To Destroy Surveillance Capitalism, his critique of Zuboff's The Age of Surveillance Capitalism:

    "I wrote "How to Destroy..." after reading Zuboff's book and realizing that while I shared her alarm about how Big Tech was exercising undue influence over us, I completely disagreed with her thesis about the source of that influence and what should be done about it.

    Zuboff calls surveillance capitalism a "rogue capitalism," a system that has used machine learning to effectively control our minds and shape our behavior so that we can no longer serve as market actors whose purchase decisions promote good firms and products over bad ones.

    Because of that, Big Tech has a permanent advantage, one that can't be addressed through traditional means like breakups or consent decrees, nor can it be analyzed through traditional privacy lenses.

    But I think that's wrong. It's giving Big Tech far too much credit. I just don't buy the thesis that Big Tech used Big Data to create a mind-control ray to sell us fidget spinners, and that Cambridge Analytica hijacked it to make us all racists.

    So I wrote "How to Destroy Surveillance Capitalism," a short book that delivers a different thesis: Big Tech is a monopoly problem. In fact, it's just a part of a wider monopoly problem that afflicts every sector of our global economy."

    Go read the thread and then the book.
