This book review was written by Eugene Kernes
“Eventually, companies began to explain these violations as the necessary quid pro quo for “free” internet services. Privacy, they said, was the price one must pay for the abundant rewards of information, connection, and other digital goods when, where, and how you want them. These explanations distracted us from the sea change that would rewrite the rules of capitalism and the digital world.” – Shoshana Zuboff, Chapter 2: August 9, 2011: Setting the Stage for Surveillance Capitalism, Page 62
“Surveillance capitalism lays claim to these decision rights. The typical complaint is that privacy is eroded, but that is misleading. In the larger societal pattern, privacy is not eroded but redistributed, as decision rights over privacy are claimed for surveillance capital. Instead of people having the rights to decide how and what they will disclose, these rights are concentrated within the domain of surveillance capitalism. Google discovered this necessary element of the new logic of accumulation: it must assert the rights to take the information upon which its success depends.” – Shoshana Zuboff, Chapter 3: The Discovery of Behavioral Surplus, Page 101
“The theory and practice of dispossession were developed and refined as the company learned how to counter and transform public resistance as an essential condition for the protection and expansion of its behavioral surplus franchise.” – Shoshana Zuboff, Chapter 5: The Elaboration of Surveillance Capitalism: Kidnap, Corner, Compete, Page 154
Within surveillance capitalism, digital users provide their experiences as data, which digital firms use to modify user behavior for financial reward. Data once served to enhance the user experience; it has since become the source of profit. And not just profit, but political power, for the data can be used to find the people who need persuading. Power rests with those who hold the data. The way the data is gathered and used threatens what it means to be human, democracy, and people’s sovereignty.
Data is sold to other businesses and rendered into behavioral predictions. Firms obtain user data through covert means: spying on everyday lives without consent, and gathering and modifying personal data. The data is claimed to be anonymous, yet it can be used to identify the person. Digital firms claim that privacy is the cost of using their seemingly free products, with the privacy violations enabled through legal terms-of-service agreements. There is no way to opt out of the violations, especially because society has become dependent on digital means to communicate and engage with others. Trading privacy for the use of products is not a legitimate choice anyone should have to make, especially because the privacy violations are not actually needed to run the applications.
Dependency On The Digital Realm:
Digital interactions have become ubiquitous and familiar, but without much reflection on what these interactions mean. The digital realm created a networked world that enabled new capabilities and prospects, but it came with negative psychological consequences. Life has become dependent on the services the internet provides, yet using the internet requires sacrificing various human values. Using the digital realm produces cognitive dissonance, for it provides a lot of value, but at a very large personal and social cost. An illegitimate choice that has become normalized.
Applications have purposely been designed to be addictive, so that people engage with them more frequently and for longer durations. Use becomes a compulsion: meant for relief, but generating anxiety. Technology has accelerated various developments in socialization, become a necessity for social participation, and is used for the sake of connection. Social pressure generates a tendency to over-share.
Data, Security, And A Choice:
Earlier digital applications acknowledged the data that was being produced, with data usage under the control of the user. Data was initially used for the user, emphasizing the sovereignty of the individual and places of sanctuary within private domains. The data did not cost customers anything, while it enabled better product experiences and the expansion of available products.
Data has since become a tool of oppression. Individuals no longer have privacy or security over their personal information, and there is a lack of accountability for data security. Data is being used and sold for predictive analysis. Refusing to consent to a firm’s data use means risking service and product functionality, and risking the safety of the individual.
The products and services provided by surveillance capitalism do not involve a value exchange; their producer-consumer reciprocities are not constructive. Those who use free products are not customers, for there is no economic exchange, nor are the users working for the firm. Unlike workers, who are paid for their efforts, users of free digital products are not. Users are not customers, nor the product; they are the raw material used to create surplus. The products and services are created to tempt users, who then become part of a program that extracts their personal experiences for other people’s wants. The products made are meant to predict behavior, and people are the raw-material supply of the data used to make the predictions.
Privacy violations have come to be explained as necessary for free internet services: privacy is the price for access to information and other products. These explanations distract from even further violations within the digital realm. Dispossession was developed and refined to better counter and transform public resistance into protection and expansion of the behavioral surplus operations. Digital services change behavior through a cycle of incursion, habituation, adaptation, and redirection.
Privacy is redistributed rather than eroded. Rights over privacy are now concentrated with digital providers, and these rights are what enable surveillance capitalism’s success. Firms use language to hide how they exercise those rights. This means the loss of individuals’ sovereignty, for they do not have control over their own data.
Whether or not an individual chooses to engage with digital applications, digital firms will engage with the individual. Even within other people’s private homes, people bring digital applications with them.
Surveillance capitalism claims the right to use human experience as free raw material for behavioral data and behavioral modification. The outcomes of that data use are proprietary behavioral surplus, used to predict what the individual will do.
The competitive process within surveillance capitalism pressures firms to continuously obtain more data from more sources, then use that data to modify behavior toward profitable outcomes. It is a digital framework that knows and shapes behavior at scale, an attempt to automate individuals and society. What the digital firms want is to know the individual better than the individual knows themselves.
Surveillance capitalism does not come about due to technological inevitabilities, but because of capitalistic logic. The firms make surveillance appear inevitable, when it is actually just a means for commercial ends that favors the firms.
Surveillance capitalists’ exploitation of the data they gather is explained in terms of emancipation, to allay the anxieties it generates. But the processes they use to obtain the data are hidden; they keep power through ignorance. Surveillance capitalism’s power comes from information asymmetry: the firms know everything about the individual, while their own operations are not made known to others. They have gathered information from each individual, to be used not for the individual but for someone else.
Digital firms found ways to legitimize and legalize their incursion into user experiences. They legitimate their claims with obscure and incomprehensible terms-of-service agreements; reading these contracts would take far longer than the time people actually spend on them. Declining the terms of service means losing updates for functionality and security. Accepting some apps gives them permission to collect and modify sensitive information, such as calling private numbers and accessing the camera for identification purposes. Calls are recorded and given to third-party firms to review how voice renders into text, to improve the voice-system algorithms. The recordings are claimed to be anonymous, but people share very sensitive information that can be used to identify them. Apps also collude with other apps covertly: activating one app triggers a variety of other tracking apps.
Applications create choice architectures to elicit specific behavior, experimenting on behavior modification for profit and without human awareness. Academic and government experiments must comply with rules meant to prevent abuses, including review boards. Private digital firms go beyond what is acceptable under those rules, and are more likely to have conflicts of interest. Digital firms’ behavior goes beyond established law and social norms.
Google holds a patent for targeted advertising that asserts rights over users’ personal information, rights that were held by users in the original social contract. The patent made Google an active agent in data gathering. Google’s digital targeted advertising led to financial success, and was also transformed into an automated auction.
The expropriation of experience depends on laws that keep it legal. Changing the laws on surveillance would make the surveillance capitalism model unsustainable, so surveillance firms fight hard to prevent laws that threaten their access to free behavioral surplus.
Content distributors and publishers operate under different legal standards: publishers are liable for defamatory posts, while distributors are not. Applications that do not review posted content tend to be seen as distributors. Companies that did set standards for content and removed posts that violated them were deemed to take responsibility for the content, and were therefore considered publishers. This created a no-win situation: the more a company protected users from malicious content, the more responsibility for the content it bore, benefiting either free speech or scoundrels. Section 230 was meant to resolve that contradiction by allowing some control over content without the risk of legal repercussions. But the contradiction does not much apply to surveillance capitalism, where content providers’ data is rendered into behavioral data that leads to product sales. Section 230’s protection of intermediaries now shields surveillance operations from examination.
Technology Leaders, And Politicians:
Politicians have chosen to attach themselves to the leaders of internet providers to appear willing to make change, but that proximity is a threat to every other internet provider. Google’s leadership’s contact with the presidency threatened Google’s competitors, as Google provided technical support and took part in the electoral cycle. With the help of the digital realm, campaigns knew everyone they needed to persuade, along with their personal and private social data.
The tech industry, and Google specifically, is a major contributor to political lobbying efforts, used to prevent legislation that would impede the extraction of behavioral surplus.
Surveillance capitalism can be used by governments as well, for political purposes rather than market ones: a forfeiture of freedom in exchange for knowledge that is used by the state.
A technology of behavior has the potential to reject the idea of freedom, harmonizing human behavior by trading freedom for guaranteed outcomes. Freedom requires that individuals choose how to develop themselves, not behavioral modification programs.
The book can be difficult to read, especially because of the ideological origins of its ideas, which it sometimes simplifies and misdirects onto the wrong targets. The author uses language in the same way the author claims digital firms use language to persuade people to give up their privacy rights.
The focus of the book is the wrongs of behavioral modification. But behavioral modification is not always against the individual; as the author makes the case, what makes behavioral modification acceptable is a legitimate choice made by the user, while surveillance capitalism’s behavioral modification occurs through covert means. What the book lacks are practical ways to identify the covert means that firms use, practical ways to identify inappropriate behavioral modification.