The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World

  • Page 3 Like many, I had initially assumed social media's dangers came mostly from misuse by bad actors - propagandists, foreign agents, fake-news peddlers - and that at worst the various platforms were a passive conduit for society's preexisting problems. But virtually everywhere I traveled in my reporting, covering far-off despots, wars, and upheavals, strange and extreme events kept getting linked back to social media. A sudden riot, a radical new group, widespread belief in some oddball conspiracy - all had a common link.
  • Page 5 the more incendiary the post, they sensed, the more widely the platforms spread it.
  • Page 7 Many at the company seemed almost unaware that the platform's algorithms and design deliberately shape users' experiences and incentives, and therefore the users themselves.
  • Page 8 Within Facebook's muraled walls, though, belief in the product as a force for good seemed unshakable.
  • Page 9 Facebook's algorithms "exploit the human brain's attraction to divisiveness," the researchers warned in a 2018 presentation later leaked to the Wall Street Journal. In fact, the presentation continued, Facebook's systems were designed in a way that delivered users "more and more divisive content in an effort to gain user attention & increase time on the platform." Public figures routinely referred to the companies as one of the gravest threats of our time. In response, the companies' leaders pledged to confront the harms flowing from their services.
  • Page 10 They unveiled election-integrity war rooms and updated content-review policies. But their business model - keeping people glued to their platforms as many hours a day as possible - and the underlying technology deployed to achieve this goal remained largely unchanged. An independent report, commissioned by the company under pressure from civil rights groups, concluded that the platform was everything its executives had insisted to me it was not. Its policies permitted rampant misinformation that could undermine elections. Its algorithms and recommendation systems were "driving people toward self-reinforcing echo chambers of extremism," training them to hate. Perhaps most damning, the report concluded that the company did not understand how its own products affected its billions of users.
  • Page 11 The early conventional wisdom, that social media promotes sensationalism and outrage, while accurate, turned out to drastically understate things. This technology exerts such a powerful pull on our psychology and our identity, and is so pervasive in our lives, that it changes how we think, behave, and relate to one another. The effect, multiplied across billions of users, has been to change society itself.
  • Page 12 With little incentive for the social media giants to confront the human cost of their empires - a cost borne by everyone else, like a town downstream from a factory pumping toxic sludge into its communal well - it would be up to dozens of alarmed outsiders and Silicon Valley defectors to do it for them.
  • Page 14 "If you joined the one anti- vaccine group," she said, "it was transformative." Nearly every vaccine- related recommendation promoted to her was for anti- vaccine content. "The recommendation engine would push them and push them and push them."
  • Page 15 Before long, the system prompted her to consider joining groups for unrelated conspiracies. Chemtrails. Flat Earth.
  • Page 15 The reason the system pushed the conspiratorial outliers so hard, she came to realize, was engagement. Social media platforms surfaced whatever content their automated systems had concluded would maximize users' activity online, thereby allowing the company to sell more ads.
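The mechanism described here is easy to caricature in code. A minimal sketch (not from the book; all names and weights are invented) of what ranking purely for predicted engagement looks like:

```python
# Hypothetical sketch of engagement-first feed ranking as the note describes:
# surface whatever content is predicted to maximize activity (and thus ad
# impressions). All names and weights are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_clicks: float    # model's estimate of click probability
    predicted_comments: float  # expected comments (arguments count too)
    predicted_shares: float    # expected reshares

def engagement_score(post: Post) -> float:
    # A weighted sum of predicted interactions. Nothing in this objective
    # asks whether the post is true, kind, or good for the user.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 2.0 * post.predicted_shares)

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Incendiary posts tend to score highest on predicted interactions,
    # so they float to the top of the feed.
    return sorted(candidates, key=engagement_score, reverse=True)
```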
  • Page 16 Facebook wasn't just indulging anti-vaccine extremists. It was creating them. Almost certainly, no one at Facebook or YouTube wanted to promote vaccine denial.
  • Page 17 But the technology building this fringe movement was driven by something even the company's CEO could not overcome: the cultural and financial mores at the core of his entire industry.
  • Page 20 As semiconductors developed into the circuit board, then the computer, then the internet, and then social media, each technology produced a handful of breakout stars, who in turn funded and guided the next handful.
  • Page 22 Human instincts to conform run deep. When people think something has become a matter of consensus, psychologists have found, they tend not only to go along, but to internalize that sentiment as their own. The outrage was being ginned up by the very Facebook product that users were railing against. That digital amplification had tricked Facebook's users, and even its leadership, into misperceiving the platform's loudest voices as representing everyone, growing a flicker of anger into a wildfire.
  • Page 23 But, crucially, it had also done something else: driven engagement up. Way up.
  • Page 24 Long after their technology's potential for harm had been made clear, the companies would claim to merely serve, and never shape or manipulate, their users' desires. But manipulation had been built into the products from the beginning. "The thought process that went into building these applications," Parker told the media conference, "was all about, ‘How do we consume as much of your time and conscious attention as possible?'" "We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you more likes and comments."
  • Page 25 the "social-validation feedback loop." The term of art is "persuasion": training consumers to alter their behavior in ways that serve the bottom line. Stanford University had operated a Persuasive Tech Lab since 1997. In 2007, a single semester's worth of student projects generated $1 million in advertising revenue.
  • Page 26 Dopamine is social media's accomplice inside your brain. It's why your smartphone looks and feels like a slot machine, pulsing with colorful notification badges, whoosh sounds, and gentle vibrations. Social apps hijack a compulsion - a need to connect - that can be even more powerful than hunger or greed. The mechanism is called intermittent variable reinforcement.
  • Page 27 Never knowing the outcome makes it harder to stop pulling the lever. Intermittent variable reinforcement is a defining feature of not only gambling and addiction but also, tellingly, abusive relationships. while posting to social media can feel like a genuine interaction between you and an audience, there is one crucial, invisible difference. Online, the platform acts as unseen intermediary. It decides which of your comments to distribute to whom, and in what context. The average American checks their smartphone 150 times per day, often to open social media.
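A toy illustration of intermittent variable reinforcement - the slot-machine schedule the passage describes - assuming a made-up reward probability:

```python
# Toy simulation of intermittent variable reinforcement: rewards arrive
# unpredictably, the schedule behavioral research links to slot machines
# and compulsive checking. The reward probability is a made-up parameter.

import random

def check_phone(n_checks: int, reward_prob: float = 0.3) -> list[bool]:
    """Each check yields a 'like' with some probability - the user can
    never predict which pull of the lever will pay out."""
    return [random.random() < reward_prob for _ in range(n_checks)]

random.seed(42)
outcomes = check_phone(150)  # roughly the 150 daily checks cited above
print(f"rewarded checks: {sum(outcomes)} of {len(outcomes)}")
# It is the unpredictability, not the total payout, that makes the
# lever so hard to stop pulling.
```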
  • Page 28 A YEAR AFTER launching the news feed, a group of Facebook developers mocked up something they called the "awesome button" - a one-click expression of approval for another user's post. After a year and a half in limbo, a new team took over what was now the "Like" button.
  • Page 29 That little button's appeal, and much of social media's power, comes from exploiting something called the sociometer. The anguish we feel from low self-esteem is wholly self-generated. Self-esteem is in fact "a psychological gauge of the degree to which people perceive that they are relationally valued and socially accepted by other people." It's what the anthropologist Brian Hare called "survival of the friendliest." The result was the development of a sociometer: a tendency to unconsciously monitor how other people in our community seem to perceive us.
  • Page 30 the platforms added a powerful twist: a counter at the bottom of each post indicating the number of likes, retweets, or upvotes it had received - a running quantification of social approval for each and every statement.
  • Page 30 When we receive a Like, neural activity flares in a part of the brain called the nucleus accumbens: the region that activates dopamine.
  • Page 31 Expressing identity, sharpening identity, seeing and defining the world through its lens. This effect remade how social media works, as its overseers and automated systems drifted toward the all- consuming focus on identity that best served their agendas.
  • Page 32 Our drive to cultivate a shared identity is so powerful that we'll construct one even out of nothing.
  • Page 33 Prejudice and hostility have always animated this instinct. Hunter-gatherer tribes sometimes competed for resources or territory. Social media's indulgence of identity wasn't obviously harmful at first. But it was always well known.
  • Page 34 In 2014, I was one of several Washington Post reporters to start Vox, a news site intended to leverage the web. We never shaped our journalism to please social media algorithms - at least, not consciously - but headlines were devised with them in mind. The most effective approach, though one that in retrospect we should have perhaps been warier of using, was identity conflict. Liberals versus conservatives. The righteousness of anti-racism. The outrageousness of lax gun laws. "Few realized, early on, that the way to win the war for attention was to harness the power of community to create identity."

Two: Everything Is Gamergate
  • Page 47 Raucous debate came to be seen as the purest meritocracy: if you couldn't hold your own or win over the crowd, if you felt harassed or unwelcome, it was because your ideas had not prevailed on merit.
  • Page 49 Peter Thiel, a founder of PayPal and the first outside investor in Facebook, had urged elevating antisocial contrarians. "If you're less sensitive to social cues, you're less likely to do the same things as everyone else around you." "There's not a lot of value placed on social niceties," Margaret O'Mara told me. "There's a tolerance for weirdness, in part because weird people have a proven track record. That's the other dimension of Silicon Valley culture. It's like everyone was an asshole."
  • Page 50 Thiel, further parlaying his PayPal success, started a fund that launched major investments in Airbnb, Lyft, and Spotify. Throughout, like many leading investors, he imposed his ideals on the companies he oversaw.
  • Page 51 with the advent of the social media era, the industry was building its worst habits into companies that then smuggled those excesses - chauvinism, a culture of harassment, majoritarianism disguised as meritocracy - into the homes and minds of billions of consumers.
  • Page 51 the norms and values that they'd encoded into the early web turned out to guide its millions of early adopters toward something very different than the egalitarian utopia they'd imagined.
  • Page 52 4chan. Anytime a user wanted to start a new thread, they had to upload an image, which kept the platform filled with user-made memes and cartoons.
  • Page 52 Long before Snapchat and others borrowed the feature, discussions automatically deleted after a brief period, which enabled unseemly behavior that might've been shunned elsewhere. So did the site's anonymity; nearly all posts are marked as written by "Anonymous," which instills an anything-goes culture and a sense of collective identity that can be alluring, especially to people who crave a sense of belonging.
  • Page 54 "Ultimately," Christopher Poole, 4chan's founder, said in 2008, "the power lies in the community to dictate its own standards." The internet's promise of total freedom appealed especially to kids, for whom off- line life is ruled by parents and teachers. Adolescents also have a stronger drive to socialize than adults, which manifests as heavier use of social networks and a greater sensitivity to what happens there. Poole had started 4chan when he was just fifteen. Kids who felt isolated off- line, like Adam, drove an outsized share of online activity, bringing the concerns of the disempowered and the bullied with them.
  • Page 55 Transgressing ever-greater taboos - even against cruelty to grieving parents - became a way to signal that you were in on the joke. "When you browse 4chan and 8chan while the rest of your friends are posting normie live-laugh-love shit on Instagram and Facebook," Adam said, "you feel different. Cooler. Part of something niche." These two unifying activities, flaunting taboos and pulling pranks, converged to become trolling. The thrill of getting a reaction out of someone even had a name: lulz, a corruption of the acronym for "laugh out loud."
  • Page 56 Unchastened by the social constraints of the off-line world, each user operates like a miniature Facebook algorithm, iteratively learning what best wins others' attention. One lesson consistently holds. To rise among tens of thousands of voices, regardless of what you post, it is better to amp up the volume, to be more extreme.
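That learning loop can be sketched as simple trial and error. Under the (invented) assumption that more extreme tones reliably earn more attention, a poster who just repeats whatever worked last time converges on extremity:

```python
# Sketch of the "miniature algorithm" idea above: repeat whichever tone
# earned the most attention so far. The tones and payoffs are assumptions,
# not data; the point is the drift the text describes.

import random

TONES = ["measured", "snarky", "outraged", "extreme"]
ATTENTION = {"measured": 2, "snarky": 5, "outraged": 12, "extreme": 20}  # assumed payoffs

def learn_to_post(rounds: int = 200, explore: float = 0.1) -> str:
    totals = {t: 0.0 for t in TONES}  # attention received per tone
    counts = {t: 1 for t in TONES}    # times tried (start at 1 to avoid /0)
    for _ in range(rounds):
        if random.random() < explore:
            tone = random.choice(TONES)  # occasionally try a different register
        else:
            tone = max(TONES, key=lambda t: totals[t] / counts[t])
        totals[tone] += random.gauss(ATTENTION[tone], 3)  # noisy attention payoff
        counts[tone] += 1
    return max(TONES, key=lambda t: totals[t] / counts[t])

random.seed(0)
print(learn_to_post())  # under these assumed payoffs, lands on "extreme"
```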
  • Page 57 "Trolling is basically internet eugenics,"
  • Page 59 From the beginning, social media platforms borrowed heavily from video games. Notifications are delivered in stylized "badges," which Gordon told the audience could double a user's time on site, while likes mimic a running score. This was more than aesthetic. Many platforms initially considered gamers - tech obsessives who would surely pump hours into this digital interface, too - to be a core market.
  • Page 60 New TV programming like My Little Pony and GI Joe delivered hyper-exaggerated gender norms, hijacking adolescents' natural gender self-discovery and converting it into a desire for molded plastic products. Tapping into our deepest psychological needs, then training us to pursue them through commercial consumption that will leave us unfulfilled and coming back for more, has been central to American capitalism since the postwar boom. Marketers, having long positioned games as childhood toys, kept boys hooked through adolescence and adulthood with - what else? - sex.
  • Page 62 Senator Trent Lott of Mississippi. His staff had deployed a now-famous push poll: "Do you believe Democrats are trying to take away your culture?" It performed spectacularly, especially with white men.
  • Page 63 Facebook, in the hopes of boosting engagement, began experimenting with breaking the so-called Dunbar limit. The British anthropologist Robin Dunbar had proposed, in the 1990s, that humans are cognitively capped at maintaining about 150 relationships. Our behavior changes, too, seeking to reset back to 150, like a circuit breaker tripping. Even online, people converged naturally on Dunbar's number. Users were pushed toward content from what Facebook called "weak ties": friends of friends, contacts of contacts, cousins of cousins. Enforced through algorithmic sophistication, the scheme worked. Facebook pulled users into ever-expanding circles of half-strangers, surpassing the Dunbar limit.
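A bare-bones sketch of the "weak ties" push described above - recommending friends-of-friends to grow a user's circle past the Dunbar range. The traversal is a standard graph walk; the function name and example data are invented:

```python
# Invented example of a "weak ties" recommender: walk the friend graph two
# hops out and suggest contacts-of-contacts, expanding a user's circle
# past the ~150-person Dunbar range. Names and data are illustrative only.

from collections import deque

def weak_tie_candidates(graph: dict[str, set[str]], user: str,
                        max_hops: int = 2) -> set[str]:
    """Collect contacts-of-contacts out to max_hops, excluding direct friends."""
    seen, frontier, found = {user}, deque([(user, 0)]), set()
    while frontier:
        node, dist = frontier.popleft()
        if dist == max_hops:
            continue
        for neighbor in graph.get(node, set()):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, dist + 1))
                if dist + 1 > 1:  # beyond direct friends: a weak tie
                    found.add(neighbor)
    return found

friends = {"ana": {"ben"}, "ben": {"ana", "cara"}, "cara": {"ben", "dev"}}
print(weak_tie_candidates(friends, "ana"))  # {'cara'} - a friend of a friend
```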
  • Page 64 But studies of rhesus monkeys and macaques, whose Dunbar-like limits are thought to mirror our own, had found that pushing them into larger groups made them more aggressive, more distrusting, and more violent. The monkeys seemed to sense that safely navigating an unnaturally large group was beyond their abilities, triggering a social fight-or-flight response that never quite turned off. They also seemed to become more focused on forming and enforcing social hierarchies, likely as a kind of defense mechanism.
  • Page 64 Facebook could push you into groups - stand-alone discussion pages focused on some topic or interest - ten times that size.
  • Page 65 "There's this conspiracy- correlation effect," DiResta said, "in which the platform recognizes that somebody who's interested in conspiracy A is typically likely to be interested in conspiracy B, and pops it up to them." "I called it radicalization via the recommendation engine," she said. "By having engagement- driven metrics, you created a world in which rage- filled content would become the norm." The algorithmic logic was sound, even brilliant. Radicalization is an obsessive, life- consuming process. Believers come back again and again, their obsession becoming an identity, with social media platforms the center of their day- to- day lives. She had seen it over and over. Recruits were drawn together by some ostensibly life- or- death threat: the terrible truth of vaccines, the Illuminati agents who spread Zika, the feminists seeking to overturn men's rightful place atop the gender hierarchy, starting with gaming.
  • Page 68 Still, Reddit was built and governed around the same early internet ideals as 4chan, and had absorbed that platform's users and cultural tics. Its up-or-down voting enforced an eclipsing majoritarianism that pushed things even further. Upvote counts are publicly displayed, tapping into users' sociometer-driven impulse for validation. The dopamine-chase glued users to the site and, as on Facebook, steered their actions. As of 2016, four years after her suit, still only 11 percent of technology venture-capital partners were women. Two percent were Black.
  • Page 68 looked like them: in 2018, 98 percent of their investment dollars went to male-led companies.
  • Page 70 "Every Man Is Responsible for His Own Soul." This would become a standard defense from social media overlords: that the importance of their revolution compelled them to disregard the petty laws and morals of the outmoded off- line world. Besides, any bad behavior was users' fault, no matter how crucial a role the platform played in enabling, encouraging, and profiting from those transgressions.
  • Page 71 Finally, nearly three weeks after the photos first appeared, Wong banned them. Reddit's users, incensed, accused the platform of selling out its principles to shadowy corporate influence and, worse, feminists.
  • Page 72 Pao was also testing a theory: that the most hateful voices, though few in number, exploited social media's tendency to amplify extreme content for its attention-winning power, tingeing the entire platform in the process.
  • Page 73 The first ban was small: a subreddit called "FatPeopleHate." Still, Reddit's userbase erupted in anger at the removals as an attack on the freedom to offend and transgress that, after all, had been an explicit promise of the social web since its founding.
  • Page 74 "The trolls are winning," Pao wrote in a Washington Post op- ed a few days later. The internet's foundational ideals, while noble, had led tech companies to embrace a narrow and extreme interpretation of free speech that was proving dangerous, she warned. She had lasted just eight months.
  • Page 75 MILO YIANNOPOULOS. Headlines like "Lying Greedy Promiscuous Feminist Bullies Are Tearing the Video Game Industry Apart" went viral on those platforms as seeming confirmation. His bosses had hoped his articles would inform Breitbart's small, far-right readership on tech issues. Instead, they tapped into a new and much larger audience that they hadn't even known existed - one that was only coming together at that moment. "Every time you write one of your commentaries, it gets 10,000 comments," Steve Bannon, Breitbart's chief, told Yiannopoulos on the site's radio show. "It goes even broader than the Breitbart audience, all over."
  • Page 76 Within three years, the angry little subculture Yiannopoulos championed would evolve into a mainstream movement so powerful that he was granted a keynote slot at the Conservative Political Action Conference, the most important event on the political right. (The invitation was later revoked.) Bannon called their cause the "alt right," a term borrowed from white-power extremists who'd hoped to rebrand for a new generation. Bannon and others on the alt right saw a chance to finally break through. "I realized Milo could connect with these kids right away," Bannon said later. "You can activate that army. They come in through Gamergate or whatever and then get turned on to politics and Trump." "They call it ‘meme magic' - when previously obscure web memes become so influential they start to affect real-world events," Yiannopoulos wrote that summer before the election. The movement coalesced around Trump, who had converged on the same tics and tactics as Yiannopoulos and other Gamergate stars, and for seemingly the same reason: it's what social media rewarded.
  • Page 78 He swung misinformation and misogyny as weapons. He trolled without shame, heaping victims with mockery and abuse. He dared society's gatekeepers to take offense at flamboyant provocations that were right off 4chan. From May 2015, a month before Trump declared his candidacy, to November 2016, a Harvard study later found, the most popular right-wing news source on Facebook was Breitbart, edging out even Fox News. Awed outsiders would long ascribe Breitbart's rise to dark-arts social media manipulation. In truth, the publication did little more than post its articles to Facebook and Twitter, just as it always had. It was, in many ways, a passive beneficiary. Facebook's systems were promoting a host of once-obscure hyperpartisan blogs and outright misinformation shops - bearing names like The Gateway Pundit, Infowars, The Conservative Treehouse, and Young Cons - into mega-publishers with the power to reshape reality for huge segments of the population.
  • Page 80 "This cycle of aggrievement and resentment and identity, and mob anger, it feels like it's consuming and poisoning the entire nation." Four: Tyranny of Cousins
  • Page 85 "We enjoy being outraged. We respond to it as a reward." The platforms had learned to indulge the outrage that brought their users "a rush - of purpose, of moral clarity, of social solidarity." The growing pace of these all- consuming meltdowns, perhaps one a week, indicated that social media was not just influencing the broader culture, but, to some extent, supplanting
  • Page 87 Popular culture often portrays morality as emerging from our most high-minded selves: the better angels of our nature, the enlightened mind. Sentimentalism says it is actually motivated by social impulses like conformity and reputation management (remember the sociometer?), which we experience as emotion. The emotional brain works fast, often resolving to a decision before conscious reason even has a chance to kick in. Moral emotion serves a social purpose, like seeking peers' approval, rewarding a Good Samaritan, or punishing a transgressor. But the instinctual nature of that behavior leaves it open to manipulation. Which is exactly what despots, extremists, and propagandists have learned to do, rallying people to their side by triggering outrage - often at some scapegoat or imagined wrongdoer. What would happen when, inevitably, social platforms learned to do the same?
  • Page 89 Much legal scholarship, Klonick knew, considers public shaming necessary for society to function: tut- tutting someone for cutting in line, shunning them for a sexist comment, getting them fired for joining a hate group. But social media was changing the way that public shaming worked, which would necessarily change the functioning of society itself. "Low cost, anonymous, instant, and ubiquitous access to the internet has removed most - if not all - of the natural checks on shaming," she wrote of her findings, "and thus changed the way we perceive and enforce social norms."
  • Page 92 Truth or falsity has little bearing on a post's reception, except to the extent that a liar is freer to alter facts to conform to a button- pushing narrative. What matters is whether the post can provoke a powerful reaction, usually outrage. A 2013 study of the Chinese platform Weibo found that anger consistently travels further than other sentiments.
  • Page 93 Right or left, the common variable was always social media, the incentives it imposes, the behavior it elicits. Our social sensitivity evolved for tribes where angering a few dozen comrades could mean a real risk of death. On social media, one person can, with little warning, face the fury and condemnation of thousands.
  • Page 97 pleasurable. Brain scans find that, when subjects harm someone they believe is a moral wrongdoer, their dopamine-reward centers activate. From behind a screen, far from our victims, there is no pang of guilt at seeing pain on the face of someone we've harmed. Nor is there shame at realizing that our anger has visibly crossed into cruelty.
  • Page 98 the platform's extreme bias toward outrage meant that misinformation prevailed, which created demand for more outrage-affirming rumors and lies.
  • Page 99 Outrage scales: people express more outrage, and demonstrate more willingness to punish the undeserving, when they think their audience is larger.
  • Page 101 algorithmically encouraged rage.

Five: Awakening the Machine
  • Page 106 "In September 2011, I sent a provocative email to my boss and the YouTube leadership team," Goodrow later wrote. "Subject line: ‘Watch time, and only watch time.' It was a call to rethink how we measured success." second. "Our job was to keep people engaged and hanging out with us,"
  • Page 108 YouTube's system seeks something more far-reaching than a monthly subscription fee. Its all-seeing eye tracks every detail of what you watch, how long you watch it, what you click on next. It monitors this across two billion users, accruing what is surely the largest dataset on viewer preferences ever assembled, which it constantly scans for patterns. Chaslot and others tweaked the system as it went, nudging its learning process to better accomplish its goal: maximum watch time.
  • Page 109 One of the algorithm's most powerful tools is topical affinity. If you watch a cat video all the way through, Chaslot explained, YouTube will show you more on return visits. The effect is to pull users toward ever more titillating variations on their interests.
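A compact sketch of the two ideas in the last two notes - a single target metric (watch time) weighted by topical affinity learned from completion history. Field names, weights, and the default affinity are all assumptions, not YouTube's actual system:

```python
# Illustrative sketch only: score candidate videos by expected minutes
# watched, scaled by affinity inferred from how fully the user finished
# past videos on the same topic. All field names are invented.

def affinity(history: list[dict], topic: str) -> float:
    """Average fraction-watched over past videos on this topic."""
    watched = [v["fraction_watched"] for v in history if v["topic"] == topic]
    return sum(watched) / len(watched) if watched else 0.1  # small default for new topics

def next_video(history: list[dict], candidates: list[dict]) -> dict:
    # A fully-watched cat video raises the affinity for "cats", so more cat
    # videos - and more titillating variants - score higher on return visits.
    return max(candidates,
               key=lambda v: v["expected_minutes"] * affinity(history, v["topic"]))
```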
  • Page 115 Focus everything, he instructed, on maximizing a few quantifiable metrics. Concentrate power in the hands of engineers who can do it. And shunt aside the rest.
  • Page 116 They were chasing a very specific model: free-to-use web services that promised breakneck user growth.
  • Page 117 in the late 2000s, Amazon and a few others set up sprawling server farms, putting their processing power and data storage up for rent, calling it "the cloud." Now you no longer needed to invest in overhead. You rented it from Amazon, uploading your website to their servers. "Forget strategy," the investor Roger McNamee wrote of this new approach. "Pull together a few friends, make a product you like, and try it in the market. Make mistakes, fix them, repeat." It was transformative for investors, too, who no longer had to sink millions into getting a startup to market. They could do it for pocket change.
  • Page 119 If the value of an ad impression kept shrinking, even the Facebooks and YouTubes might cease to be viable. Their only choice was to permanently grow the number of users, and those users' time on site, many times faster than those same actions drove down the price of an ad. But controlling the market of human attention, as their business models had fated them to attempt, was beyond anything a man- made program could accomplish.
  • Page 120 Wojcicki's YouTube existed to convert eyeballs into money. Democracy and social cohesion were somebody else's problem. "So, when YouTube claims they can't really say why the algorithm does what it does, they probably mean that very literally." The average user's time on the platform skyrocketed. The company estimated that 70 percent of its time on site, an astronomical share of its business, was the result of videos pushed by its algorithm-run recommendation system.
  • Page 121 In 2014, the same year that Wojcicki took over YouTube, Facebook's algorithm replaced its preference for Upworthy-style clickbait with something even more magnetic: emotionally engaging interactions. Across the second half of that year, as the company gradually retooled its systems, the platform's in-house researchers tracked 10 million users to understand the effects. They found that the changes artificially inflated the amount of pro-liberal content that liberal users saw and the amount of pro-conservative content that conservatives saw. Just as Pariser had warned. The result, even if nobody at Facebook had consciously intended as much, was algorithmically ingrained hyperpartisanship. The process, as Facebook researchers put it somewhat gingerly, in an implied warning that the company did not heed, was "associated with adopting more extreme attitudes over time and misperceiving facts about current events."
  • Page 123 TikTok, a Chinese-made app, shows each user a stream of videos selected almost entirely by algorithms. Its A.I. is so sophisticated that TikTok almost immediately attracted 80 million American users, who often use it for hours at a time, despite most of its engineers not speaking English or understanding American culture.
  • Page 124 Like DiResta's anti-vaxxers, or even Upworthy, the Russians hijacked the algorithm's own preferences. It wasn't just that the agents repeated phrases or behaviors that performed well. Their apparent mission, of stirring up political discord, seemed to naturally align with what the algorithms favored anyway, often to extremes.
  • Page 125 "He was telling me, ‘Oh, but there are so many videos, it has to be true,'" Chaslot said. "What convinced him was not the individual videos, it was the repetition. And the repetition came from the recommendation engine." illusory truth effect. We are, every hour of every day, bombarded with information. To cope, we take mental shortcuts to quickly decide what to accept or reject. One is familiarity; if a claim feels like something we've accepted as true before, it probably still When he searched YouTube for Pope Francis, for instance, 10 percent of the videos it displayed were conspiracies. On global warming, it was 15 percent. But the real shock came when Chaslot followed algorithmic recommendations for what to watch next, which YouTube has said accounts for most of its watch time. A staggering 85 percent of recommended videos on Pope Francis were conspiracies, asserting Francis's "true" identity or purporting to expose Satanic plots at the Vatican.
  • Page 128 But the influence of algorithms only deepened, including at the last holdout, Twitter. For years, the service had shown each user a simple, chronological feed of their friends' tweets. Until, in 2016, it introduced an algorithm that sorted posts - for engagement, of course, and to predictable effect. "The recommendation engine appears to reward inflammatory language and outlandish claims."
  • Page 129 Shortly after Twitter algorithmified, Microsoft launched an A.I.-run Twitter account called Tay. The bot operated, like the platforms, on machine learning, though with a narrower goal: to converse convincingly with humans by learning from each exchange. Within twenty-four hours, Tay's tweets had taken a disturbing turn. "You absolutely do NOT let an algorithm mindlessly devour a whole bunch of data that you haven't vetted even a little bit."

Six: The Fun House Mirror
  • Page 132 Conspiracy belief is highly associated with "anomie," the feeling of being disconnected from society.
  • Page 136 It was undeniable that Trump owed his rise to nondigital factors, too: the institutional breakdown of the Republican Party, a decades- long rise in polarization and public distrust, white backlash to social change, a radicalized right- wing electorate. Social media had created none of these. But, in time, a network of analysts and whistleblowers would prove that it had exacerbated them all, in some cases drastically.
  • Page 138 moral outrage can become infectious in groups, and it can alter the mores and behaviors of people exposed to it. Across topics, across political factions, what psychologists refer to as "moral-emotional words" consistently boosted any tweet's reach. Moral-emotional words convey feelings like disgust, shame, or gratitude, and carry calls for communal judgment. That makes these words different from either narrowly emotional sentiments ("Overjoyed at today's marriage equality ruling") or purely moral ones ("The president is a liar"), for which Brady's effect didn't appear. Tweets with moral-emotional words, he found, traveled 20 percent farther - for each moral-emotional word.
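Taken at face value, a 20 percent boost per moral-emotional word implies a multiplicative factor of 1.2 per word. A back-of-envelope sketch, with a tiny invented word list standing in for the validated dictionaries such studies actually use:

```python
# Back-of-envelope version of the finding summarized above: ~20% more
# reach per moral-emotional word, i.e. a multiplicative 1.2 per word.
# The word list is a tiny invented stand-in, not Brady's dictionary.

MORAL_EMOTIONAL = {"disgusting", "shameful", "evil", "blessed", "greedy", "hate"}

def expected_reach_multiplier(tweet: str, per_word_boost: float = 0.20) -> float:
    words = {w.strip(".,!?").lower() for w in tweet.split()}
    n = len(words & MORAL_EMOTIONAL)  # count distinct moral-emotional words
    return (1 + per_word_boost) ** n

print(expected_reach_multiplier("This shameful, greedy policy is evil"))
# -> 1.2 ** 3 ≈ 1.73: roughly 73% more expected reach than neutral phrasing
```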
  • Page 139 Brady found something else. When a liberal posted a tweet with moral-emotional words, its reach substantially increased among other liberals, but declined with conservatives. (And vice versa.) It won the user more overall attention and validation, in other words, at the cost of alienating people from the opposing side. Proof that Twitter encouraged polarization.
  • Page 143 They were acting on a widely held misinterpretation of something known as contact theory. Coined after World War II to explain why desegregated troops became less prone to racism, the theory suggested that social contact led distrustful groups to humanize one another. But subsequent research has shown that this process works only under narrow circumstances: managed exposure, equality of treatment, neutral territory, and a shared task. Simply mashing hostile tribes together, researchers repeatedly found, worsens animosity. People, as a rule, perceive out- groups as monoliths.
  • Page 144 Even in its most rudimentary form, the very structure of social media encourages polarization. Reading an article and then the comments field beneath it, an experiment found, leads people to develop more extreme views on the subject in the article. Control groups that read the article with no comments became more moderate and open-minded. News readers, the researchers discovered, process information differently when they are in a social environment: social instincts overwhelm reason, leading them to look for affirmation of their side's righteousness.
  • Page 148 The data revealed, as much as any foreign plot, the ways that the Valley's products had amplified the reach, and exacerbated the impact, of malign influence. (She later termed this "ampliganda," a sort of propaganda whose power comes from its propagation by masses of often unwitting people.)
  • Page 149 Over many iterations, the Russians settled on a strategy. Appeal to people's group identity. Tell them that identity was under attack. Whip up outrage against an out- group. And deploy as much moral- emotional language as possible.
  • Page 152 the internet offered political outsiders a way around the mainstream outlets that shunned them. As those candidates' grassroots supporters spent disproportionate time on YouTube, the system learned to push users to those videos, creating more fans, driving up watch time further. But thanks to the algorithms' preference for extreme and divisive content, it was mostly fringe radicals who benefited, and not candidates across the spectrum.
  • Page 154 The platforms, they concluded, were reshaping not just online behavior but underlying social impulses, and not just individually but collectively, potentially altering the nature of "civic engagement and activism, political polarization, propaganda and disinformation." They called it the MAD model, for the three forces rewiring people's minds. Motivation: the instincts and habits hijacked by the mechanics of social media platforms. Attention: users' focus manipulated to distort their perceptions of social cues and mores. Design: platforms that had been constructed in ways that train and incentivize certain behaviors.
  • Page 156 As psychologists have known since Pavlov, when you are repeatedly rewarded for a behavior, you learn a compulsion to repeat it. As you are trained to turn all discussions into matters of high outrage, to express disgust with out-groups, to assert the superiority of your in-group, you will eventually shift from doing it for external rewards to doing it simply because you want to do it. The drive comes from within. Your nature has been changed. The second experiment demonstrated that the attention economy, by tricking users into believing that their community held more extreme and divisive views than it really did, had the same effect. Showing subjects lots of social media posts from peers that expressed outrage made them more outrage-prone themselves. It was a chilling demonstration of how portraying people and events in sharply moral-emotional terms brings out audiences' instincts for hatred and violence - which is, after all, exactly what social platforms do, on a billions-strong scale, every minute of every day.

Eight: Church Bells
  • Page 181 the rumors activated a sense of collective peril in groups that were dominant but felt their status was at risk - majorities angry and fearful over change that threatened to erode their position in the hierarchy. This is status threat. When members of a dominant social group feel at risk of losing their position, it can spark a ferocious reaction. They grow nostalgic for a past, real or imagined, when they felt secure in their dominance ("Make America Great Again").
  • Page 182 We don't just become more tribal, we lose our sense of self. It's an environment, they wrote, "ripe for the psychological state of deindividuation" - surrendering part of your will to that of the group. Deindividuation, with its power to override individual judgment, joined with status threat, which can trigger collective aggression on a terrible scale.
  • Page 185 Anti-refugee sentiment is among the purest expressions of status threat, combining fear of demographic change with racial tribalism.
  • Page 186 There's a term for the process Pauli described, of online jokes gradually internalized as sincere. It's called irony poisoning. Heavy social media users often call themselves "irony poisoned," a joke on the dulling of the senses that comes from a lifetime engrossed in social media subcultures, where ironic detachment, algorithmic overstimulation, and dare-to-offend humor prevail. Desensitization makes the ideas seem less taboo or extreme, which in turn makes them easier to adopt.
  • Page 188 The defining traits and tics of superposters, mapped out in a series of psychological studies, are broadly negative. One is dogmatism: "relatively unchangeable, unjustified certainty." Dogmatics tend to be narrow-minded, pushy, and loud. Another: grandiose narcissism, defined by feelings of innate superiority and entitlement.
  • Page 189 Narcissists are consumed by cravings for admiration and belonging, which makes social media's instant feedback and large audiences all but irresistible.

Nine: The Rabbit Hole
  • Page 203 "Really the only place where they could exchange their thoughts and coalesce and find allies was online." These groups didn't reflect real- world communities of any significant size, he realized. They were native to the web - and, as a result, shaped by the digital spaces that had nurtured them. Climate skeptics largely gathered in the comments sections of newspapers and blogs. There, disparate contrarians and conspiracists, people with no shared background beyond a desire to register their objection to climate coverage got clumped together. It created a sense of common purpose.
  • Page 208 By January 2018, Kaiser had amassed enough evidence to begin slowly going public. He told a Harvard seminar that the coalescing far right of which the Charlottesville gathering was a part was "not done by users," he was coming to believe, at least not entirely, but had been in part "created through the YouTube algorithm."
  • Page 209 Jordan Peterson, a Canadian psychology professor. In 2013, Peterson began posting videos addressing, amid esoteric Jungian philosophy, youth male distress.
  • Page 209 YouTube searches for "depression" or certain self-help keywords often led to Peterson. His videos' unusual length, sixty minutes or more, aligns with the algorithm's drive to maximize watch time. So does his college-syllabus method of serializing his argument over weeks, which requires returning for the next lecture and the next. His appeal taps into what the sociologist Michael Kimmel calls "aggrieved entitlement."
  • Page 210 YouTube's algorithm, in many cases, tapped into that discontent, recommending channels that took Peterson's message to greater and greater extremes. Users who comment on Peterson's videos subsequently become twice as likely to pop up in the comments of extreme-right YouTube channels, a Princeton study found. The algorithm makes the connection. The scholar J. M. Berger calls it "the crisis-solution construct." When people feel destabilized, they often reach for a strong group identity to regain a sense of control.
  • Page 211 Incel forums had begun as places to share stories about feeling lonely. Users discussed how to cope with living "hugless." But the norms of social media one- upmanship, of attention chasing, still prevailed. The loudest voices rose.
  • Page 212 By 2021, fifty killings had been claimed by self- described incels, a wave of terrorist violence.
  • Page 212 The movement was a fringe of a fringe, dwarfed by Pizzagate or the alt right. But it hinted at social media's potential to galvanize young white male anomie into whole communities of extremism - an increasingly widespread phenomenon.
  • Page 214 These channels were her "YouTube friends," salve for a lost marriage and feelings of isolation. They were community. They were identity.
  • Page 214 YouTube's system, they found, did three things uncannily well. First, it stitched together wholly original clusters of channels. There was nothing innate connecting these beyond the A.I.'s conclusion that showing them alongside one another would keep users watching.
  • Page 215 Second, YouTube's recommendations generally moved toward the more extreme end of whatever network the user was in. The third discovery: the system's recommendations were clustering mainstream right-wing channels, and even some news channels, with many of the platform's most virulent hatemongers, incels, and conspiracy theorists.
  • Page 216 One channel sat conspicuously in the network's center, a black hole toward which YouTube's algorithmic gravity pulled: Alex Jones.
  • Page 216 Rauchfleisch warned, "Being a conservative on YouTube means that you're only one or two clicks away from extreme far- right channels, conspiracy theories, and radicalizing content."
  • Page 219 Jack Dorsey, Twitter's CEO: "We didn't fully predict or understand the real-world negative consequences" of launching an "instant, public, global" platform, he wrote that March. He conceded that it had resulted in real harms. He began, in interviews, voluntarily raising heretical ideas that other tech CEOs continued to fervently reject: maximizing for engagement is dangerous; likes and retweets encourage polarization. The company, he said, would reengineer its systems to promote "healthy" conversations rather than engaging ones. He hired prominent experts and research groups to develop new features or design elements to do it.
  • Page 220 dangerous. "They told me, ‘People click on Flat Earth videos, so they want a Flat Earth video,'" he recalled. "And my point was, no, it's not that because someone clicked on the Flat Earth video, he wants to be lied to. He is just curious, and there is a clickbait title. But to the algorithm, when you watch a video, it means you endorse it."
  • Page 221 YouTube, by showing users many videos in a row all echoing the same thing, hammers especially hard at two of our cognitive weak points - that repeated exposure to a claim, as well as the impression that the claim is widely accepted, each make it feel truer than we would otherwise judge it to be.
  • Page 221 The post went on for twenty more lines. References just cryptic enough that users could feel like they were cracking a secret code, and obvious enough to ensure that they would.
  • Page 222 Followers got more than a story. QAnon, as the movement called itself, became a series of online communities where believers gathered to parse Q's posts. Extremist groups have long recruited on a promise to fulfill adherents' need for purpose and belonging. Conspiracies insist that events, rather than being uncontrollable or impersonal, are all part of a hidden plot whose secrets you can unlock. Reframing chaos as order, telling believers they alone hold the truth, restores their sense of autonomy and control. It's why QAnon adherents often repeat to one another their soothing mantra: "Trust the plan."
  • Page 224 But for all the feelings of autonomy, security, and community that QAnon offered, it came at a cost: crushing isolation.
  • Page 225 It was one of the things that made QAnon so radicalizing. Joining often worsened the very sense of isolation and being adrift that had led people to it in the first place. With nowhere else to turn and now doubly needful of reassurance, followers gave themselves over to the cause even more fully.