The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power

Shoshana Zuboff
1 Home or Exile in the Digital Future
  • Page 14 The Puppet Master, Not the Puppet
  • Page 16 technology is an expression of other interests.
  • Page 98 "What is Google?" Douglas Edwards recounts a 2001 session with the founders that probed their answers to that precise query. It was Page who ruminated, "If we did have a category, it would be personal information.... The places you've seen. Communications.... Sensors are really cheap.... Storage is cheap. Cameras are cheap. People will generate enormous amounts of data.... Everything you've ever heard or seen or experienced will become searchable. Your whole life will be searchable." 2
  • Page 164 "conversational commerce," 112 where, for example, a bot knows what shoes you bought last week, it knows your preferences from your past purchases, it knows your profile and can call a recommendations model to determine what products you have the most affinity to buy.... Using the power of data and analytics, the bot can respond back with recommendations that it determines are most relevant for you. It can also invite people from your social network to help you make a choice. Once you make the selection, it will use your size information, shipping address, payment information to ship the selected dress to you. 113
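The pipeline this passage describes (purchase history feeds an affinity model, which ranks candidate products for the bot to pitch back) can be sketched minimally. Everything below is hypothetical illustration; none of these names come from Zuboff or from any vendor's actual API.

```python
from collections import Counter

def affinity_scores(purchase_history, catalog):
    """Rank catalog items by overlap with the categories a user has bought before.

    purchase_history: list of (item, category) tuples already observed.
    catalog: dict mapping candidate item -> category.
    A toy stand-in for the "recommendations model" the bot calls.
    """
    category_counts = Counter(cat for _, cat in purchase_history)
    # Score each candidate by how often the user bought its category.
    return sorted(catalog, key=lambda item: category_counts[catalog[item]],
                  reverse=True)

history = [("sneakers", "shoes"), ("boots", "shoes"), ("t-shirt", "tops")]
catalog = {"loafers": "shoes", "jeans": "pants", "hoodie": "tops"}
print(affinity_scores(history, catalog))  # shoes dominate, so "loafers" ranks first
```

The point of the sketch is how little "knowing your preferences" requires: a frequency count over past behavior already yields a ranked pitch.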
  • Page 173 Another example of surveillance-as-a-service [is] participation. "People will give up their privacy to get something they want"
  • Page 186 material to be accumulated and analyzed as means to others' market ends. The shadow text is a burgeoning accumulation of behavioral surplus and its analyses, and it says more about us than we can know about ourselves. Worse still, it becomes increasingly difficult, and perhaps impossible, to refrain from contributing to the shadow text. It automatically feeds on our experience as we engage in the normal and necessary routines of social participation.
  • Page 190 Under the regime of surveillance capitalism, the corporation's scientists are not recruited to solve world hunger or eliminate carbon-based fuels. Instead, their genius is meant to storm the gates of human experience, transforming it into data and translating it into a new market colossus that creates wealth by predicting, influencing, and controlling human behavior.
  • Page 190 We have come to take for granted that the internet enables an unparalleled diffusion of information, promising more knowledge for more people: a mighty democratizing force that exponentially realizes Gutenberg's revolution in the lives of billions of individuals. But this grand achievement has blinded us to a different historical development, one that moves out of range and out of sight, designed to exclude, confuse, and obscure.
  • Page 191 "Personal information is increasingly used to enforce standards of behavior. Information processing is developing, therefore, into an essential element of long-term strategies of manipulation intended to mold and adjust individual conduct." 34
  • Page 191 "The danger that the computer poses is to human autonomy. The more that is known about a person, the easier it is to control him.
  • Page 199 Mark Weiser's "The Computer for the 21st Century," which has framed Silicon Valley's technology objectives for nearly three decades. Weiser introduced what he called "ubiquitous computing" with two legendary sentences: "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it."
  • Page 200 The first wave of prediction products enabled targeted online advertising.
  • Page 201 it became clear that the best predictions would have to approximate observation.
  • Page 203 The aim of this undertaking is not to impose behavioral norms, such as conformity or obedience, but rather to produce behavior that reliably, definitively, and certainly leads to desired commercial results.
  • Page 213 "Because transactions are now computer-mediated we can observe behavior that was previously unobservable and write contracts on it," Varian says. "This enables transactions that were simply not feasible before." He gravitates to the example of "vehicular monitoring systems," recognizing their paradigmatic power. Varian says that if someone stops making monthly car payments, "Nowadays it's a lot easier just to instruct the vehicular monitoring system not to allow the car to be started and to signal the location where it can be picked up." 29 Insurance companies, he notes, can also rely on these monitoring systems to check if customers are driving safely and thus determine whether to maintain the insurance policy, vary the cost of premiums, and decide whether to pay a claim.
  • Page 213 "There are lots of people who are monetizing data today. You get on Google, and it seems like it's free. It's not free. You're giving them information; they sell your information.
  • Page 225 The image of technology as an autonomous force with unavoidable actions and consequences has been employed across the centuries to erase the fingerprints of power and absolve it of responsibility.
  • Page 226 "The changes and disruptions that an evolving technology repeatedly caused in modern life were accepted as given or inevitable simply because no one bothered to ask whether there were other possibilities." 70 Surveillance capitalist leaders assume that we will succumb to the naturalistic fallacy. Google wants us to accept that its rules simply reflect the requirements of autonomous processes, something that people cannot control. ... Men and women made it, and they can control
  • Page 230 Optimized parking enforcement depends on Sidewalk's algorithms "to calculate the most lucrative routes for parking cops," earning cities millions of extra dollars that they desperately need but that arrive at the expense of their citizens.
  • Page 234 Technologies are designed to render our experience into data, as in rendering oil from fat. This typically occurs outside of our awareness, let alone our consent....Every time we encounter a digital interface we make our experience available to "datafication," thus "rendering unto surveillance capitalism" its continuous tithe of raw-material supplies.
  • Page 240 we are forced to purchase products that we can never own while our payments fund our own surveillance and coercion.
  • Page 241 All that is moist and alive must hand over its facts. There can be no shadow, no darkness. The unknown is intolerable. The solitary is forbidden.
  • Page 242 Most smartphone apps demand access to your location even when it's not necessary for the service they provide, simply because the answer to this question is so lucrative. ... "Come here now!" "Buy this here!" "An offer, just for you!" 22 Download the Starbucks app, and then leave your house if you want to see this in action. As one marketing consultancy advises, "Mobile advertising, the ultimate form of geo-targeting, is the holy grail of advertising."
  • Page 243 Most of us simply do not and cannot know the extent to which our phone doubles as a tracking device for corporate surveillance.
  • Page 246 In 2016 Chinese search engine Baidu, often referred to as the Google of China, announced that its "Big Data Lab" uses location data from its 600 million users to track and predict the dynamics of the Chinese economy.
  • Page 246 As powerful as location data are, wearable technologies and their applications are another significant proving ground in the act of body rendition. 39
  • Page 256 Instead of having to ask Google questions, it should "know what you want and tell you before you ask the question."
  • Page 256 "Google Now has to know a lot about you and your environment to provide these services.
  • Page 256 "Why am I willing to share all this private information?" he asks. "Because I get something in return.... These digital assistants will be so useful that everyone will want one."
  • Page 256 Varian reasons that inequality offers an opportunity to raise the ante on Google's quid pro quo for effective life. He counsels that the way to predict the future is to observe what rich people have because that's also what the middle class and the poor will want. "What do rich people have now?" he asks rhetorically. "Personal assistants."
  • Page 257 That the luxuries of one generation or class become the necessities of the next has been fundamental to the evolution of capitalism during the last five hundred years. ... In 1767 the political economist Nathaniel Forster worried that "fashionable luxury" was spreading "like a contagion," and he complained of the "perpetual restless ambition in each of the inferior ranks to raise themselves to the level of those immediately above them." ... Varian casts personalization as a twenty-first-century equivalent of these historical dynamics, the new "necessaries" for the harried masses bent under the weight of stagnant wages, dual-career obligations, indifferent corporations, and austerity's hollowed-out public institutions. Varian's bet is that the digital assistant will be so vital a resource in the struggle for effective life that ordinary people will accede to its substantial forfeitures. "There is no putting the genie back in the bottle," Varian the inevitabilist insists. "Everyone will expect to be tracked and monitored, since the advantages, in terms of convenience, safety, and services, will be so great... continuous monitoring will be the norm." 6 Everyone, that is, except those wealthy or stubborn enough to achieve effective life without Google's assistance and thus escape the worst excesses of rendition. As decision rights and self-determination become privileges of the wealthy, what will Varian offer to answer those who clamor for the same?
  • Page 259 Online and offline behavioral surplus--your e-mail content, where you went this afternoon, what you said, what you did, how you felt--are combined into prediction products that can serve an emerging marketplace in which every aspect of your daily reality is up for bid. ...Facebook's vice president in charge of messaging products described the company's goals for M by saying, "We start capturing all of your intent from the things you want to do. Intent often leads to buying something, or to a transaction and that's an opportunity for us to [make money] over time." Most importantly, the VP stressed, "M learns from human behaviors." ... By 2017, Facebook had scaled back its machine intelligence ambitions and focused its personal assistant on the core mission: commerce. "The team in there now is finding ways to activate commercial intent inside Messenger," a Facebook executive reported. 12 The idea is to "prioritize commerce-driven experiences" and design new ways for users to "quickly buy things" without the tedium of entering credit card information, flipping pages, or opening applications.
  • Page 260 the "personal digital assistant" is revealed as a market avatar, another Trojan horse in which the determination to render and monetize your life is secreted under the veil of "assistance" and embellished with the poetry of "personalization."
  • Page 260 A digital assistant may derive its character from your inclinations and preferences, but it will be skewed and disfigured in unknown measure by the hidden market methods and contests that it conceals.
  • Page 261 An Amazon senior vice president comments on the company's voice-activated home devices: "The nice thing about the Amazon device business is that when we sell a device, generally people buy more blue jeans. And little black dresses. And shoes. And so that's good." "Voice shopping," he concludes, is good for business and good for predicting business. 15
  • Page 261 "conversation" turns the new personal digital assistant into a voice that sits between your life and the new markets for your life ...Smart-home devices such as Amazon's Echo or Google Home render rivers of casual talk from which sophisticated content analyses produce enhanced predictions that "anticipate" your needs. ... "We want users to have an ongoing, two-way dialogue with Google. We want to help you get things done in your real world and we want to do it for you," ... "For example, you can be in front of this structure in Chicago and ask Google, 'Who Designed This?' You don't need to say 'the bean' or 'the cloud gate.' We understand your context and we answer that the designer is Anish Kapoor."
  • Page 262 There was a time when you searched Google, but now Google searches you.
  • Page 264 Samsung acknowledges that the voice commands aimed at triggering the TV's voice-recognition capabilities are sent to a third party and adds, "Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition." 23
  • Page 267 The children will learn first that there are no boundaries between self and market. Later they will wonder how it could ever have been different.
  • Page 267 shift in focus from making great products for you to collecting great data about you.
  • Page 275 The price you are offered does not derive from what you write about but how you write it. ...It is not what is in your sentences but in their length and complexity, not what you list but that you list, not the picture ... but the choice of filter and degree of saturation, not what you disclose but how you share or fail to, not where you make plans to see your friends but how you do so: a casual "later" or a precise time and place? Exclamation marks and adverb choices operate as revelatory and potentially damaging signals of your self. ... few people understand that companies such as "Facebook, Snapchat, Microsoft, Google and others have access to data that scientists would not ever be able to collect." 65
  • Page 276 In his 2015 interview, Kosinski observed that "all of our interactions are being mediated through digital products and services which basically means that everything is being recorded."
  • Page 276 By early 2015, IBM announced that its Watson Personality Service was open for business. 68 The corporation's machine intelligence tools are even more complex and invasive than those used in most academic studies. In addition to the five-factor personality model, IBM assesses each individual across twelve categories of "needs," including "Excitement, Harmony, Curiosity, Ideal, Closeness, Self-expression, Liberty, Love, Practicality, Stability, Challenge, and Structure." It then identifies "values," defined as "motivating factors which influence a person's decision-making across five dimensions: Self-transcendence/Helping others, Conservation/Tradition, Hedonism/Taking pleasure in life, Self-enhancement/Achieving success, and Open to change/Excitement." 69
  • Page 277 In a Twitter targeted-ad experiment, IBM found that it could significantly increase click-through rates and "follow" rates by targeting individuals with high "openness" and low "neuroticism" scores on the five-factor personality analysis.
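The selection step behind that experiment (score an audience on five-factor traits, then target only the segment predicted to be most persuadable) reduces to a simple filter. The thresholds and field names below are illustrative assumptions, not IBM's actual values.

```python
def select_targets(users, openness_min=0.7, neuroticism_max=0.3):
    """Filter a trait-scored audience to the segment the experiment targeted:
    high 'openness' and low 'neuroticism' on the five-factor model.
    Scores are assumed to lie in [0, 1]; thresholds are hypothetical."""
    return [u["id"] for u in users
            if u["openness"] >= openness_min and u["neuroticism"] <= neuroticism_max]

audience = [
    {"id": "a", "openness": 0.9, "neuroticism": 0.1},  # clears both thresholds
    {"id": "b", "openness": 0.4, "neuroticism": 0.2},  # openness too low
    {"id": "c", "openness": 0.8, "neuroticism": 0.6},  # neuroticism too high
]
print(select_targets(audience))  # only "a" is targeted
```

Notice that the machinery is trivial once the trait scores exist; the invasive part is the rendition of personality into those scores, not the targeting arithmetic.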
  • Page 278 In this new world, paranoia and anxiety function as sources of protection from machine invasion for profit. Must we teach our children to be anxious and suspicious?
  • Page 280 influencing voters based not on their demographics but on their personalities...."
  • Page 282 "affective computing," "emotion analytics," and "sentiment analysis." The personalization project descends deeper toward the ocean floor with these new tools, where they lay claim to yet a new frontier of rendition trained not only on your personality but also on your emotional life.
  • Page 283 "Knowing the real-time emotional state can help businesses to sell their product and thereby increase revenue." 88
  • Page 284 Propaganda and advertising have always been designed to appeal to unacknowledged fears and yearnings. These have relied more on art than science, using gross data or professional intuition for the purpose of mass communication.
  • Page 285 In 1978, Ekman, along with frequent collaborator Wallace Friesen, published the seminal Facial Action Coding System (FACS), which provided that scheme.
  • Page 285 Rosalind Picard
  • Page 285 "affective computing."
  • Page 285 a computational system to automate the analysis of Ekman's facial configurations and correlate micro-expressions with their emotional causality.
  • Page 286 "computers can be given the ability to recognize emotions as well as a third-person human observer."
  • Page 289 "I think in the future we'll assume that every device just knows how to read your emotions." 113
  • Page 289 Emoshape,
  • Page 289 classify twelve emotions with up to 98 percent accuracy, enabling its "artificial intelligence or robot to experience 64 trillion possible distinct emotional states." 114
  • Page 289 it doesn't matter that your Fitbit doesn't have a camera, because your phone does, and your laptop does, and your TV will. All that data gets fused with biometrics from your wearable devices and builds an emotional profile for you."
  • Page 290 "I do believe that if we have information about your emotional experiences we can help you be in a positive mood," Kaliouby says.
  • Page 291 No matter how much is taken from me, this inward freedom to create meaning remains my ultimate sanctuary. Jean-Paul Sartre writes that "freedom is nothing but the existence of our will," and he elaborates: "Actually it is not enough to will; it is necessary to will to will." 117 This rising up of the will to will is the inner act that secures us as autonomous beings who project choice into the world and exercise the qualities of self-determining moral judgment that are civilization's necessary and final bulwark.
  • Page 291 What happens when they come for my "truth" uninvited and determined to march through my self, taking the bits and pieces that can nourish their machines to reach their objectives? Cornered in my self, there is no escape.
  • Page 294 Sensors are used to modify people's behavior just as easily as they modify device behavior. There are many great things we can do with the internet of things, like lowering the heat in all the houses on your street so that the transformer is not overloaded, or optimizing an entire industrial operation. But at the individual level, it also means the power to take actions that can override what you are doing or even put you on a path you did not choose.
  • Page 294 three key approaches to economies of action,
  • Page 294 The first two I call "tuning" and "herding." The third is already familiar as what behavioral psychologists refer to as "conditioning."
  • Page 294 Subliminal cues ... Richard Thaler and Cass Sunstein call the "nudge," which they define as "any aspect of a choice architecture that alters people's behavior in a predictable way."
  • Page 295 "Herding" is a second approach that relies on controlling key elements in a person's immediate context. ... foreclosing action alternatives ... "We are learning how to write the music, and then we let the music make them dance," ... We can engineer the context around a particular behavior and force change that way. Context-aware data allow us to tie together your emotions, your cognitive functions, your vital signs, etcetera. We can know if you shouldn't be driving, and we can just shut your car down.
  • Page 296 modification should mimic the evolutionary process, in which naturally occurring behaviors are "selected" for success by environmental conditions. ... Skinner called the application of reinforcements to shape specific behaviors "operant conditioning." ... "Conditioning at scale is essential to the new science of massively engineered human behavior." ... As digital signals monitor and track a person's daily activities, the company gradually masters the schedule of reinforcements--rewards, recognition, or praise that can reliably produce the specific user behaviors that the company selects for dominance:
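The "schedule of reinforcements" in this passage is Skinner's variable-ratio schedule: a reward lands after an unpredictably varying number of responses, which is the pattern that most reliably sustains a behavior. A minimal simulation (my own illustrative sketch, not code from the book or any company):

```python
import random

def variable_ratio_schedule(mean_ratio, n_actions, seed=0):
    """Simulate a Skinner-style variable-ratio reinforcement schedule:
    a reward arrives after a random number of actions, averaging
    `mean_ratio`. Returns the action numbers at which rewards fired.
    This is the schedule slot machines (and engagement loops) exploit."""
    rng = random.Random(seed)
    rewards = []
    # Gap until the next reward is drawn uniformly from 1..(2*mean_ratio - 1),
    # so the expected gap is mean_ratio but any single gap is unpredictable.
    next_reward_at = rng.randint(1, 2 * mean_ratio - 1)
    for action in range(1, n_actions + 1):
        if action == next_reward_at:
            rewards.append(action)
            next_reward_at = action + rng.randint(1, 2 * mean_ratio - 1)
    return rewards

print(variable_ratio_schedule(mean_ratio=5, n_actions=50))
```

"Conditioning at scale" amounts to fitting such a schedule to each user's observed behavior rather than choosing it in advance.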
  • Page 298 Data tell what happened but not why it happened. In the absence of causal knowledge, even the best predictions are only extrapolations from the past. ... As Varian says, "If you really want to understand causality, you have to run experiments. And if you run experiments continuously, you can continuously improve your system." 4 ... surveillance capitalists declare their right to modify others' behavior for profit according to methods that bypass human awareness, individual decision rights, and the entire complex of self-regulatory processes that we summarize with terms such as autonomy and self-determination.
  • Page 301 unprecedented power to persuade, influence, and ultimately manufacture behavior. ... This time the experimenters "manipulated the extent to which people (N = 689,003) were exposed to emotional expressions in their News Feed." 9 The experiment was structured like one of those allegedly benign A/B tests. In this case one group was exposed to mostly positive messages in their news feed and the other to predominantly negative messages. The idea was to test whether even subliminal exposure to specific emotional content would cause people to change their own posting behavior to reflect that content. It did. Whether or not users felt happier or sadder, the tone of their expression changed to reflect their news feed. ... "Online messages influence our experience of emotions, which may affect a variety of offline behaviors." The team celebrated its work as "some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network,"
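The design the passage describes is a two-arm experiment: assign users to a feed condition, then compare the mean tone of their subsequent posts across arms. A schematic of that study design (hypothetical names and toy data throughout; this is not Facebook's code):

```python
from statistics import mean

def ab_tone_shift(users, assign, tone_after):
    """Split users into two feed conditions and compare mean posting tone.

    assign: maps a user to "positive_feed" or "negative_feed".
    tone_after: maps a user to their observed post sentiment in [-1, 1].
    Returns the mean tone per arm; a gap between arms is the contagion effect.
    """
    arms = {"positive_feed": [], "negative_feed": []}
    for u in users:
        arms[assign(u)].append(tone_after(u))
    return {arm: mean(scores) for arm, scores in arms.items()}

users = list(range(6))
assign = lambda u: "positive_feed" if u % 2 == 0 else "negative_feed"
# Toy outcome: users shown positive feeds post slightly more positively.
tone_after = lambda u: 0.3 if u % 2 == 0 else -0.2
result = ab_tone_shift(users, assign, tone_after)
print(result)
```

The mechanics are those of any routine A/B test; what made this one controversial was that the manipulated variable was users' emotional environment, without their awareness.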
  • Page 303 "the Facebook study paints a dystopian future in which academic researchers escape ethical restriction by teaming up with private companies to test increasingly dangerous or harmful interventions." 16
  • Page 305 "By monitoring posts, pictures, interactions, and Internet activity, Facebook can work out when young people feel 'stressed,' 'defeated,' 'overwhelmed,' 'anxious,' 'nervous,' 'stupid,' 'silly,' 'useless,' and a 'failure.'" 20
  • Page 306 ... outcomes. "Anticipatory emotions are more likely to be expressed early in the week," the analysis counsels, "while reflective emotions increase on the weekend. Monday–Thursday is about building confidence; the weekend is for broadcasting achievements."
  • Page 307 "Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness."
  • Page 307 Individual awareness is the enemy of telestimulation because it is the necessary condition for the mobilization of cognitive and existential resources. There is no autonomous judgment without awareness. Agreement and disagreement, participation and withdrawal, resistance or collaboration: none of these self-regulating choices can exist without awareness.
  • Page 308 Every threat to human autonomy begins with an assault on awareness, "tearing down our capacity to regulate our thoughts, emotions, and desires." 22 ... a scale to measure a person's "susceptibility to persuasion." They found that the single most important determinant of one's ability to resist persuasion is what they call "the ability to premeditate." 23 ... people who harness self-awareness to think through the consequences of their actions are more disposed to chart their own course and are significantly less vulnerable to persuasion techniques. ... People who are consciously committed to a course of action or set of principles are less likely to be persuaded to do something that violates that commitment. ... Human consciousness itself is a threat to surveillance revenues,
  • Page 314 Ian Bogost, a professor of interactive computing at Georgia Tech and a digital culture observer, insists that these systems should be called "exploitationware" rather than games because their sole aim is behavior manipulation and modification. 36
  • Page 328 The commodification of behavior
  • Page 339 Surveillance capitalists' interests have shifted from using automated machine processes to know about your behavior to using machine processes to shape your behavior according to their interests.