Reconceiving our political situation in light of technology?

March 29, 2020

Here's a just-so story: Liberalism emerged when bourgeois economics enabled (some) individuals to have enough material wealth, and thus power, to stand up to the state and its staff, which in early modern Europe mostly meant a struggle between the nobility and the bourgeoisie.  Liberalism was about securing "freedom," and the main danger was the state.  There was always subsidiary attention to other forms of consolidated institutional power, but the centrality of the state was always granted.  Perhaps now we have to reconceive the dangers, to see corporate power as in some ways outranking governmental power.  At least, that's the worry this post suggests.


Shoshana Zuboff’s The Age of Surveillance Capitalism got a bit of attention in the fall, when it was published, but I think it actually deserves a great deal more.  This piece by Tim Wu—author of The Attention Merchants, another fine book—makes a case for why it deserves more attention, and more follow-on discussion.

As Wu points out, Zuboff’s claims are pretty dramatic, but also pretty plausible:

Silicon Valley has invented, if not yet perfected, the technology that completes [B.F.] Skinner’s [behaviorist] vision, and so, she believes, the behavioral engineering of humanity is now within reach. 

and

If “industrial capitalism depended upon the exploitation and control of nature,” then surveillance capitalism, she writes, “depends instead upon the exploitation and control of human nature.” 

This is really good—a new picture of what is happening in our world, and a new set of terms by which to describe it.

She accomplishes this in two ways, he thinks: first, she gives us a new vision of the world we inhabit, most fundamentally by giving us a new language to bring it into view,

a vocabulary that captures the significance of tech surveillance. Her best coinage is almost certainly the title of the book, but there are others of note, like “prediction products”—items that employ user data to “anticipate what you will do now, soon, and later” and then are traded in “behavioral futures markets”; or “the extraction imperative,” which is her phrase for what motivates firms to collect as much behavioral and personal data as possible. The “dispossession cycle” is the means, for Zuboff, by which this is accomplished. Of the essential amorality of the tech industry, she says, dryly, that “friction is the only evil.” 

this helps, as he puts it, “tell us something about the relationship between capitalism and totalitarian systems of control.”  Indeed it does—once we have a vocabulary for bringing into view the kind of world we’re in these days, certain things become unmissable.


But second, as Wu points out, Zuboff makes a profound claim in political theory, about where our normative concern should be.  Up till now, liberalism has been understood to be designed to secure privacy against the state.  As Wu puts it:


If private spaces for every individual were once (say, in the sixteenth century) only something the rich had, the spread of wealth to a propertied middle class and the building of homes with separate rooms (the invention of “upstairs”) is what made it plausible for legal thinkers like Louis Brandeis to speak of the masses enjoying a right to privacy, to be unwatched—a right to be “let alone.” It is not surprising that we don’t begin to see the legal idea of privacy form until the eighteenth century, with the spread of private spaces in which one could conceal oneself from “the unwanted gaze,” whether it belonged to neighbors or government.

Central to this institutionalization of privacy was the free market (Wu says “capitalism,” but I want to go with the free market).  


Here's my account of how this went down: We used to think that totalitarianism would come in the form of government control, that the state as an institution was the kind of thing powerful enough always and perpetually to threaten our rights.  Thus, what we needed to do was to protect ourselves against intrusion from the state.  We did that by constructing a robust concept of privacy, and institutions to secure the reality of what that concept promised.  Central among those institutions was a free-market system to deliver wealth and power and property to individuals, which wealth and power and property served (among other things) as bulwarks against government intrusion.

But now, things, as Bob Dylan put it, have changed:

what we’re learning is that the symbiosis between capitalism and privacy was maybe just a phase, a four-hundred-year fad. For capitalism is an adaptive creature, a perfect chameleon; it has no disabling convictions but seeks only profit. If privacy pays, great, but if totalizing control pays more, then so be it.

In a capitalist system, the expected level of privacy can actually be captured by one single equation. Is there more money to be made through surveillance or through the building of walls? For a long time, the answer was walls, because walls made up houses and other forms of private property.

What has happened is that the economic conditions have changed:

Today, the balance has shifted. There is still money in building walls, but the surveillance industries must be counted as among the most significant parts of the economy. Surveillance is at the center of the business models of firms like Google and Facebook, and a part of Amazon, Uber, Lyft, and others. 

What we see is an emergent new form, not just of economic wealth, but also of political power:

This form of power, according to her, does not depend on coercion or terror, as under a dictatorial system, but on “ownership of the means of behavioral modification.” In other words, she thinks that the future belongs to whoever is running the Skinner boxes.

These companies have the capacity to take away our freedom in a deep way, Zuboff and Wu think.  They can manipulate us, control us, and dominate us, in ways that make us less our own self-sovereigns and more their instruments—instruments whereby they can redirect wealth to their coffers, via our wallets.  In fact I would say the issue here is a larger one still, touched on also by Elizabeth Anderson's recent book Private Government.  It is the issue of whether we appropriately understand the "political" landscape we exist in.  All of these works suggest we do not.

Wu thinks Zuboff offers too conspiratorial and paranoid a reading of this, and his critique on that point seems fair to me; it’s easy for us to personalize problems and then moralize them—to think they are the consequence of deliberate action by deliberate actors—and from there we easily offer paranoid accounts of how things go bad.  Zuboff’s book can reasonably be seen as suffering from this malady.  And Wu’s alternative narrative is grim enough, anyway; as he says, he thinks it’s “a bit less Doctor Doomian and more Faustian”:

[Google’s] role in the rise of surveillance capitalism is therefore a story of a different set of human failings: a certain blindness to consequence, coupled with a dangerous desire to have it all. 

What’s his proposal?  Interestingly, it’s an appeal to an accountable public structure, in other words, what the GOP would have us call big government:

The protection of human freedom can no longer be thought of merely as a matter of traditional civil rights, the rights to speech, assembly, and voting that we’ve usually taken as the bedrocks of a free society. What we most urgently need is something else: protection against widespread behavioral control and advanced propaganda techniques. And that begins with completely rethinking how we control the collection of data.

That will require not a privacy statute, as some might imagine, but a codified antisurveillance regime. We need, in other words, laws that prevent the mass collection of behavioral data. 

[…this] would stop the gratuitous surveillance and the reckless accumulation of personalized data. It would do that by allowing only the collection of data necessary to the task at hand: an app designed to help you mix cocktails would not, for example, be allowed to collect location data. Gratuitous surveillance would be banned—and after collecting data, firms would be forced, by default, to get rid of it, or fully anonymize the rest of it.

There are two final thoughts I have, for the moment, about all of this.

First: it seems plausible to see this set of concerns as about the bounds of what counts as the political.  In that sense, the debate here is one that Marxism has been trying to have with liberalism ever since "On the Jewish Question."  And I think that that text's critique of liberalism still bears some listening to.  Whether this new form of info-capitalism will make all of us Marxists is not the real issue; the real issue is the degree to which we should all be worried about this, and the degree to which our worries may be sharpened and interconnected by looking at the language Marx offers us.


Second: there's a deep metaphysical issue at stake here.  At the very end of the Wu article, there’s the kind of metaphysical puzzle that law professors think it fun to drop at the end of articles, but only because they’ve never taken such puzzles seriously (I'm looking at you, Jeb Rubenfeld):

What we have learned, what Skinner and secret police alike have realized, is this: to know everything about someone is to create the power to control that person. We may not be there yet, but there is a theoretical point—call it the Skinnerlarity—where enough data will be gathered about humanity to predict, with some reasonable reliability, what everyone on earth will do at any moment. That accomplishment would change the very structure of experience. As the legal scholar Jonathan Zittrain has said, it would make life “a highly realistic but completely tailored video game where nothing happens by chance.”

That’s why we must dare to say what would sound like blasphemy in another age. It may be that a little less knowledge is what will keep us free.

I wonder if this is true.  (Thinking more here would involve thinking harder about issues most recently opened up in public discourse by Thaler and Sunstein’s book Nudge, though in fact the questions go back to Huxley's Brave New World.)  Are we the kinds of creatures who can be so modified?  Hannah Arendt argued, in the "Epilogue" to her Origins of Totalitarianism, that what totalitarianism showed is that "human nature is at stake" in the politics of the twentieth century.  Maybe what we're seeing is something connected to that.


A scary thought.


(P.S. I know his name is "Jed" not "Jeb," but he comes off as such a tool I can't help but troll him.)