The Chaos Machine: The Inside Story of How Social Media Rewired Our Minds and Our World
Page 3
Like many, I had initially assumed social media's dangers came mostly from misuse by bad actors -- propagandists, foreign agents, fake-news peddlers -- and that at worst the various platforms were a passive conduit for society's preexisting problems. But virtually everywhere I traveled in my reporting, covering far-off despots, wars, and upheavals, strange and extreme events kept getting linked back to social media. A sudden riot, a radical new group, widespread belief in some oddball conspiracy -- all had a common link.
Page 5
the more incendiary the post, they sensed, the more widely the platforms spread it.
Page 7
Many at the company seemed almost unaware that the platform's algorithms and design deliberately shape users' experiences and incentives, and therefore the users themselves.
Page 8
Within Facebook's muraled walls, though, belief in the product as a force for good seemed unshakable.
Page 9
attraction to divisiveness," the researchers warned in a 2018 presentation later leaked to the Wall Street Journal. In fact, the presentation continued, Facebook's systems were designed in a way that delivered users "more and more divisive content in an effort to gain user attention & increase time on the platform."
Page 9
Public figures routinely referred to the companies as one of the gravest threats of our time. In response, the companies' leaders pledged to confront the harms flowing from their services.
Page 10
They unveiled election-integrity war rooms and updated content-review policies. But their business model -- keeping people glued to their platforms as many hours a day as possible -- and the underlying technology deployed to achieve this goal remained largely unchanged.
Page 10
commissioned by the company under pressure from civil rights groups, concluded that the platform was everything its executives had insisted to me it was not. Its policies permitted rampant misinformation that could undermine elections. Its algorithms and recommendation systems were "driving people toward self-reinforcing echo chambers of extremism," training them to hate. Perhaps most damning, the report concluded that the company did not understand how its own products affected its billions of users.
Page 11
The early conventional wisdom, that social media promotes sensationalism and outrage, while accurate, turned out to drastically understate things.
Page 11
This technology exerts such a powerful pull on our psychology and our identity, and is so pervasive in our lives, that it changes how we think, behave, and relate to one another. The effect, multiplied across billions of users, has been to change society itself.
Page 12
With little incentive for the social media giants to confront the human cost of their empires -- a cost borne by everyone else, like a town downstream from a factory pumping toxic sludge into its communal well -- it would be up to dozens of alarmed outsiders and Silicon Valley defectors to do it for them.
One: Trapped in the Casino
Page 14
"If you joined the one anti-vaccine group," she said, "it was transformative." Nearly every vaccine-related recommendation promoted to her was for anti-vaccine content. "The recommendation engine would push them and push them and push them."
Page 15
Before long, the system prompted her to consider joining groups for unrelated conspiracies. Chemtrails. Flat Earth.
Page 15
The reason the system pushed the conspiratorial outliers so hard, she came to realize, was engagement. Social media platforms surfaced whatever content their automated systems had concluded would maximize users' activity online, thereby allowing the company to sell more ads.
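The ranking mechanic described here can be reduced to a few lines. Below is a minimal sketch in Python, not any platform's actual code: the Post fields, the weights, and the scoring function are all invented for illustration. It scores each candidate post by predicted engagement and surfaces the top scorers, whatever their content.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_clicks: float     # hypothetical model outputs, not real platform signals
    predicted_comments: float
    predicted_shares: float

def engagement_score(post: Post) -> float:
    # Illustrative weights: interactions that keep users on-site count most.
    return post.predicted_clicks + 3 * post.predicted_comments + 5 * post.predicted_shares

def rank_feed(candidates: list[Post], k: int = 10) -> list[Post]:
    # Nothing here asks whether a post is true or healthy -- only whether
    # it will maximize activity, and therefore ad sales.
    return sorted(candidates, key=engagement_score, reverse=True)[:k]

Posts that reliably provoke comments and shares, divisive or conspiratorial ones included, win such a ranking automatically; no one has to intend it.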
Page 16
Facebook wasn't just indulging anti- vaccine extremists. It was creating them.
Page 16
Almost certainly, no one at Facebook or YouTube wanted to promote vaccine denial.
Page 17
But the technology building this fringe movement was driven by something even the company's CEO could not overcome: the cultural and financial mores at the core of his entire industry.
Page 20
As semiconductors developed into the circuit board, then the computer, then the internet, and then social media, each technology produced a handful of breakout stars, who in turn funded and guided the next handful.
Page 22
human instincts to conform run deep. When people think something has become a matter of consensus, psychologists have found, they tend not only to go along, but to internalize that sentiment as their own.
Page 22
the outrage was being ginned up by the very Facebook product that users were railing against.
Page 22
That digital amplification had tricked Facebook's users, and even its leadership, into misperceiving the platform's loudest voices as representing everyone, growing a flicker of anger into a wildfire.
Page 23
But, crucially, it had also done something else: driven engagement up. Way up.
Page 24
Long after their technology's potential for harm had been made clear, the companies would claim to merely serve, and never shape or manipulate, their users' desires. But manipulation had been built into the products from the beginning.
Page 24
"The thought process that went into building these applications," Parker told the media conference, "was all about, 'How do we consume as much of your time and conscious attention as possible?'"
Page 24
"We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever. And that's going to get you to contribute more content, and that's going to get you more likes and comments."
Page 25
the "social-validation feedback loop,"
Page 25
The term of art is "persuasion": training consumers to alter their behavior in ways that serve the bottom line. Stanford University had operated a Persuasive Tech Lab since 1997. In 2007, a single semester's worth of student projects generated $1 million in advertising revenue.
Page 26
Dopamine is social media's accomplice inside your brain. It's why your smartphone looks and feels like a slot machine, pulsing with colorful notification badges, whoosh sounds, and gentle vibrations.
Page 26
Social apps hijack a compulsion -- a need to connect -- that can be even more powerful than hunger or greed.
Page 26
intermittent variable reinforcement.
Page 27
Never knowing the outcome makes it harder to stop pulling the lever. Intermittent variable reinforcement is a defining feature of not only gambling and addiction but also, tellingly, abusive relationships.
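In behavioral terms, this is a variable-ratio reward schedule. A toy simulation (the one-in-four payout rate is an arbitrary assumption) shows the dynamic: any given check of the feed might be the one that pays off, so there is never a natural stopping point.

import random

def check_feed() -> bool:
    # Variable-ratio schedule: rewards arrive unpredictably,
    # averaging one in four checks but never on a fixed rhythm.
    return random.random() < 0.25

checks, rewards = 0, 0
while rewards < 5:          # keep pulling the lever until five payoffs arrive
    checks += 1
    if check_feed():
        rewards += 1
print(f"{checks} checks to collect {rewards} rewards")

A fixed schedule would let the brain predict the reward and tune it out; it is the unpredictability that keeps the loop open.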
Page 27
while posting to social media can feel like a genuine interaction between you and an audience, there is one crucial, invisible difference. Online, the platform acts as unseen intermediary.
Page 27
It decides which of your comments to distribute to whom, and in what context.
Page 27
The average American checks their smartphone 150 times per day, often to open social media.
Page 28
A YEAR AFTER launching the news feed, a group of Facebook developers mocked up something they called the "awesome button" -- a one-click expression of approval for another user's post.
Page 28
After a year and a half in limbo, a new team took over what was now the "Like" button.
Page 29
That little button's appeal, and much of social media's power, come from exploiting something called the sociometer.
Page 29
The anguish we feel from low self-esteem is wholly self-generated.
Page 29
self-esteem is in fact "a psychological gauge of the degree to which people perceive that they are relationally valued and socially accepted by other people."
Page 29
It's what the anthropologist Brian Hare called "survival of the friendliest." The result was the development of a sociometer: a tendency to unconsciously monitor how other people in our community seem to perceive us.
Page 30
the platforms added a powerful twist: a counter at the bottom of each post indicating the number of likes, retweets, or upvotes it had received -- a running quantification of social approval for each and every statement.
Page 30
When we receive a Like, neural activity flares in a part of the brain called the nucleus accumbens: the region that activates dopamine.
Page 31
Expressing identity, sharpening identity, seeing and defining the world through its lens. This effect remade how social media works, as its overseers and automated systems drifted toward the all- consuming focus on identity that best served their agendas.
Page 32
Our drive to cultivate a shared identity is so powerful that we'll construct one even out of nothing.
Page 33
Prejudice and hostility have always animated this instinct. Hunter- gatherer tribes sometimes competed for resources or territory.
Page 33
Social media's indulgence of identity wasn't obviously harmful at first. But it was always well known.
Page 34
In 2014, I was one of several Washington Post reporters to start Vox, a news site intended to leverage the web. We never shaped our journalism to please social media algorithms -- at least, not consciously -- but headlines were devised with them in mind. The most effective approach, though one that in retrospect we should perhaps have been warier of using, was identity conflict. Liberals versus conservatives. The righteousness of anti-racism. The outrageousness of lax gun laws.
Page 34
"Few realized, early on, that the way to win the war for attention was to harness the power of community to create identity."
Two: Everything Is Gamergate
Page 47
Raucous debate came to be seen as the purest meritocracy: if you couldn't hold your own or win over the crowd, if you felt harassed or unwelcome, it was because your ideas had not prevailed on merit.
Page 49
Peter Thiel, a founder of PayPal and the first outside investor in Facebook, had urged elevating antisocial contrarians.
Page 49
"If you're less sensitive to social cues, you're less likely to do the same things as everyone else around you."
Page 49
"There's not a lot of value placed on social niceties," Margaret O'Mara told me. "There's a tolerance for weirdness, in part because weird people have a proven track record. That's the other dimension of Silicon Valley culture. It's like everyone was an asshole."
Page 50
Thiel, further parlaying his PayPal success, started a fund that launched major investments in Airbnb, Lyft, and Spotify. Throughout, like many leading investors, he imposed his ideals on the companies he oversaw.
Page 51
with the advent of the social media era, the industry was building its worst habits into companies that then smuggled those excesses -- chauvinism, a culture of harassment, majoritarianism disguised as meritocracy -- into the homes and minds of billions of consumers.
Page 51
the norms and values that they'd encoded into the early web turned out to guide its millions of early adopters toward something very different from the egalitarian utopia they'd imagined.
Page 52
4chan
Page 52
Anytime a user wanted to start a new thread, they had to upload an image, which kept the platform filled with user-made memes and cartoons.
Page 52
Long before Snapchat and others borrowed the feature, discussions automatically deleted after a brief period, which enabled unseemly behavior that might've been shunned elsewhere. So did the site's anonymity; nearly all posts are marked as written by "Anonymous," which instills an anything-goes culture and a sense of collective identity that can be alluring, especially to people who crave a sense of belonging.
Page 54
"Ultimately," Christopher Poole, 4chan's founder, said in 2008, "the power lies in the community to dictate its own standards."
Page 54
The internet's promise of total freedom appealed especially to kids, for whom off-line life is ruled by parents and teachers. Adolescents also have a stronger drive to socialize than adults, which manifests as heavier use of social networks and a greater sensitivity to what happens there. Poole had started 4chan when he was just fifteen. Kids who felt isolated off-line, like Adam, drove an outsized share of online activity, bringing the concerns of the disempowered and the bullied with them.
Page 55
Transgressing ever-greater taboos -- even against cruelty to grieving parents -- became a way to signal that you were in on the joke. "When you browse 4chan and 8chan while the rest of your friends are posting normie live-laugh-love shit on Instagram and Facebook," Adam said, "you feel different. Cooler. Part of something niche."
Page 55
These two unifying activities, flaunting taboos and pulling pranks, converged to become trolling.
Page 55
The thrill of getting a reaction out of someone even had a name: lulz, a corruption of the acronym for "laugh out loud."
Page 56
Unchastened by the social constraints of the off- line world, each user operates like a miniature Facebook algorithm, iteratively learning what best wins others' attention.
Page 56
One lesson consistently holds. To rise among tens of thousands of voices, regardless of what you post, it is better to amp up the volume, to be more extreme.
Page 57
"Trolling is basically internet eugenics,"
Page 59
From the beginning, social media platforms borrowed heavily from video games. Notifications are delivered in stylized "badges," which Gordon told the audience could double a user's time on site, while likes mimic a running score. This was more than aesthetic. Many platforms initially considered gamers -- tech obsessives who would surely pump hours into this digital interface, too -- to be a core market.
Page 60
New TV programming like My Little Pony and G.I. Joe delivered hyper-exaggerated gender norms, hijacking adolescents' natural gender self-discovery and converting it into a desire for molded plastic products.
Page 60
Tapping into our deepest psychological needs, then training us to pursue them through commercial consumption that will leave us unfulfilled and coming back for more, has been central to American capitalism since the postwar boom.
Page 60
Marketers, having long positioned games as childhood toys, kept boys hooked through adolescence and adulthood with -- what else? -- sex.
Page 62
Senator Trent Lott of Mississippi. His staff had deployed a now-famous push poll: "Do you believe Democrats are trying to take away your culture?" It performed spectacularly, especially with white men.
Page 63
Facebook, in the hopes of boosting engagement, began experimenting with breaking the so-called Dunbar limit. The British anthropologist Robin Dunbar had proposed, in the 1990s, that humans are cognitively capped at maintaining about 150 relationships.
Page 63
Our behavior changes, too, seeking to reset back to 150, like a circuit breaker tripping. Even online, people converged naturally on Dunbar's number.
Page 63
Users were pushed toward content from what Facebook called "weak ties": friends of friends, contacts of contacts, cousins of cousins.
Page 63
Enforced through algorithmic sophistication, the scheme worked. Facebook pulled users into ever-expanding circles of half-strangers, surpassing the Dunbar limit.
Page 64
But studies of rhesus monkeys and other macaques, whose Dunbar-like limits are thought to mirror our own, had found that pushing them into larger groups made them more aggressive, more distrusting, and more violent.
Page 64
The monkeys seemed to sense that safely navigating an unnaturally large group was beyond their abilities, triggering a social fight- or- flight response that never quite turned off. They also seemed to become more focused on forming and enforcing social hierarchies, likely as a kind of defense mechanism.
Page 64
Facebook could push you into groups -- stand-alone discussion pages focused on some topic or interest -- ten times that size.
Page 65
"There's this conspiracy-correlation effect," DiResta said, "in which the platform recognizes that somebody who's interested in conspiracy A is typically likely to be interested in conspiracy B, and pops it up to them."
Page 65
"I called it radicalization via the recommendation engine," she said. "By having engagement-driven metrics, you created a world in which rage-filled content would become the norm."
Page 65
The algorithmic logic was sound, even brilliant. Radicalization is an obsessive, life-consuming process. Believers come back again and again, their obsession becoming an identity, with social media platforms the center of their day-to-day lives.
Page 65
She had seen it over and over. Recruits were drawn together by some ostensibly life-or-death threat: the terrible truth of vaccines, the Illuminati agents who spread Zika, the feminists seeking to overturn men's rightful place atop the gender hierarchy, starting with gaming.
Three: Opening the Portal
Page 68
Still, Reddit was built and governed around the same early internet ideals as 4chan, and had absorbed that platform's users and cultural tics.
Page 68
Its up-or-down voting enforced an eclipsing majoritarianism that pushed things even further.
Page 68
upvote counts are publicly displayed, tapping into users' sociometer-driven impulse for validation. The dopamine chase glued users to the site and, as on Facebook, steered their actions.
Page 68
As of 2016, four years after her suit, still only 11 percent of technology venture-capital partners were women. Two percent were Black.
Page 68
looked like them: in 2018, 98 percent of their investment dollars went to male-led companies.
Page 70
"Every Man Is Responsible for His Own Soul." This would become a standard defense from social media overlords: that the importance of their revolution compelled them to disregard the petty laws and morals of the outmoded off-line world. Besides, any bad behavior was users' fault, no matter how crucial a role the platform played in enabling, encouraging, and profiting from those transgressions.
Page 71
Finally, nearly three weeks after the photos first appeared, Wong banned them. Reddit's users, incensed, accused the platform of selling out its principles to shadowy corporate influence and, worse, feminists.
Page 72
Pao was also testing a theory: that the most hateful voices, though few in number, exploited social media's tendency to amplify extreme content for its attention- winning power, tingeing the entire platform in the process.
Page 73
The first ban was small: a subreddit called "FatPeopleHate."
Page 73
Still, Reddit's userbase erupted in anger at the removals as an attack on the freedom to offend and transgress that, after all, had been an explicit promise of the social web since its founding.
Page 74
"The trolls are winning," Pao wrote in a Washington Post op-ed a few days later. The internet's foundational ideals, while noble, had led tech companies to embrace a narrow and extreme interpretation of free speech that was proving dangerous, she warned. She had lasted just eight months.
Page 75
MILO YIANNOPOULOS,
Page 75
Headlines like "Lying Greedy Promiscuous Feminist Bullies Are Tearing the Video Game Industry Apart" went viral on those platforms as seeming confirmation.
Page 75
His bosses had hoped his articles would inform Breitbart's small, far-right readership on tech issues. Instead, they tapped into a new and much larger audience that they hadn't even known existed -- one that was only coming together at that moment. "Every time you write one of your commentaries, it gets 10,000 comments," Steve Bannon, Breitbart's chief, told Yiannopoulos on the site's radio show. "It goes even broader than the Breitbart audience, all over."
Page 76
Within three years, the angry little subculture Yiannopoulos championed would evolve into a mainstream movement so powerful that he was granted a keynote slot at the Conservative Political Action Conference, the most important event on the political right.
Page 76
(The invitation was later revoked.) Bannon called their cause the "alt right," a term borrowed from white-power extremists who'd hoped to rebrand for a new generation.
Page 76
Bannon and others on the alt right saw a chance to finally break through. "I realized Milo could connect with these kids right away," Bannon said later. "You can activate that army. They come in through Gamergate or whatever and then get turned on to politics and Trump."
Page 77
"They call it 'meme magic' -- when previously obscure web memes become so influential they start to affect real-world events," Yiannopoulos wrote that summer before the election.
Page 77
The movement coalesced around Trump, who had converged on the same tics and tactics as Yiannopoulos and other Gamergate stars, and for seemingly the same reason: it's what social media rewarded.
Page 78
He swung misinformation and misogyny as weapons. He trolled without shame, heaping victims with mockery and abuse. He dared society's gatekeepers to take offense at flamboyant provocations that were right off 4chan.
Page 78
From May 2015, a month before Trump declared his candidacy, to November 2016, a Harvard study later found, the most popular right-wing news source on Facebook was Breitbart, edging out even Fox News.
Page 78
Awed outsiders would long ascribe Breitbart's rise to dark-arts social media manipulation. In truth, the publication did little more than post its articles to Facebook and Twitter, just as it always had. It was, in many ways, a passive beneficiary. Facebook's systems were promoting a host of once-obscure hyperpartisan blogs and outright misinformation shops -- bearing names like The Gateway Pundit, Infowars, The Conservative Treehouse, and Young Cons -- into mega-publishers with the power to reshape reality for huge segments of the population.
Page 80
"This cycle of aggrievement and resentment and identity, and mob anger, it feels like it's consuming and poisoning the entire nation."
Four: Tyranny of Cousins
Page 85
"We enjoy being outraged. We respond to it as a reward."
Page 85
The platforms had learned to indulge the outrage that brought their users "a rush -- of purpose, of moral clarity, of social solidarity."
Page 85
The growing pace of these all- consuming meltdowns, perhaps one a week, indicated that social media was not just influencing the broader culture, but, to some extent, supplanting
Page 87
Popular culture often portrays morality as emerging from our most high-minded selves: the better angels of our nature, the enlightened mind. Sentimentalism says it is actually motivated by social impulses like conformity and reputation management (remember the sociometer?), which we experience as emotion.
Page 87
the emotional brain works fast, often resolving to a decision before conscious reason even has a chance to kick in.
Page 87
social purpose, like seeking peers' approval, rewarding a Good Samaritan, or punishing a transgressor. But the instinctual nature of that behavior leaves it open to manipulation. Which is exactly what despots, extremists, and propagandists have learned to do, rallying people to their side by triggering outrage -- often at some scapegoat or imagined wrongdoer. What would happen when, inevitably, social platforms learned to do the same?
Page 89
Much legal scholarship, Klonick knew, considers public shaming necessary for society to function: tut-tutting someone for cutting in line, shunning them for a sexist comment, getting them fired for joining a hate group. But social media was changing the way that public shaming worked, which would necessarily change the functioning of society itself. "Low cost, anonymous, instant, and ubiquitous access to the internet has removed most -- if not all -- of the natural checks on shaming," she wrote of her findings, "and thus changed the way we perceive and enforce social norms."
Page 92
Truth or falsity has little bearing on a post's reception, except to the extent that a liar is freer to alter facts to conform to a button-pushing narrative. What matters is whether the post can provoke a powerful reaction, usually outrage.
Page 92
A 2013 study of the Chinese platform Weibo found that anger consistently travels further than other sentiments.
Page 93
Right or left, the common variable was always social media, the incentives it imposes, the behavior it elicits.
Page 93
Our social sensitivity evolved for tribes where angering a few dozen comrades could mean a real risk of death. On social media, one person can, with little warning, face the fury and condemnation of thousands.
Page 97
pleasurable. Brain scans find that, when subjects harm someone they believe is a moral wrongdoer, their dopamine-reward centers activate.
Page 97
From behind a screen, far from our victims, there is no pang of guilt at seeing pain on the face of someone we've harmed. Nor is there shame at realizing that our anger has visibly crossed into cruelty.
Page 98
the platform's extreme bias toward outrage meant that misinformation prevailed, which created demand for more outrage-affirming rumors and lies.
Page 99
scales; people express more outrage, and demonstrate more willingness to punish the undeserving, when they think their audience is larger.
Page 101
algorithmically encouraged rage.
Five: Awakening the Machine
Page 106
"In September 2011, I sent a provocative email to my boss and the YouTube leadership team," Goodrow later wrote. "Subject line: 'Watch time, and only watch time.' It was a call to rethink how we measured success."
Page 106
second. "Our job was to keep people engaged and hanging out with us,"
Page 108
YouTube's system seeks something more far-reaching than a monthly subscription fee. Its all-seeing eye tracks every detail of what you watch, how long you watch it, what you click on next. It monitors this across two billion users, accruing what is surely the largest dataset on viewer preferences ever assembled, which it constantly scans for patterns. Chaslot and others tweaked the system as it went, nudging its learning process to better accomplish its goal: maximum watch time.
Page 109
One of the algorithm's most powerful tools is topical affinity. If you watch a cat video all the way through, Chaslot explained, YouTube will show you more on return visits.
Page 109
The effect is to pull users toward ever more titillating variations on their interests.
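One way to picture that pull is a toy model, not YouTube's actual system; the topics, intensity scores, and update rule below are invented for illustration. Completing a video raises the viewer's affinity for its topic, and the recommender then favors the most intense video within that topic.

from collections import defaultdict

affinity: dict[str, float] = defaultdict(float)    # topic -> learned interest

def record_watch(topic: str, fraction_watched: float) -> None:
    # Watching to completion is read as endorsement of the topic.
    affinity[topic] += fraction_watched

def recommend(catalog: list[tuple[str, str, float]]) -> str:
    # Each entry: (title, topic, intensity). Score = affinity x intensity,
    # so results drift toward the most titillating variant of whatever
    # the user already watches.
    title, _, _ = max(catalog, key=lambda video: affinity[video[1]] * video[2])
    return title

record_watch("fitness", 1.0)
print(recommend([
    ("Gentle morning stretches", "fitness", 0.2),
    ("EXTREME 30-day shred they don't want you to see", "fitness", 0.9),
    ("Intro to birdwatching", "nature", 0.5),
]))    # prints the most extreme fitness video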
Page 115
Focus everything, he instructed, on maximizing a few quantifiable metrics.
Page 115
Concentrate power in the hands of engineers who can do it. And shunt aside the rest.
Page 116
They were chasing a very specific model: free-to-use web services that promised breakneck user growth.
Page 117
in the late 2000s, Amazon and a few others set up sprawling server farms, putting their processing power and data storage up for rent, calling it "the cloud." Now you no longer needed to invest in overhead. You rented it from Amazon, uploading your website to their servers.
Page 117
"Forget strategy," the investor Roger McNamee wrote of this new approach. "Pull together a few friends, make a product you like, and try it in the market. Make mistakes, fix them, repeat." It was transformative for investors, too, who no longer had to sink millions into getting a startup to market. They could do it for pocket change.
Page 119
If the value of an ad impression kept shrinking, even the Facebooks and YouTubes might cease to be viable. Their only choice was to permanently grow the number of users, and those users' time on site, many times faster than those same actions drove down the price of an ad. But controlling the market of human attention, as their business models had fated them to attempt, was beyond anything a man-made program could accomplish.
Page 120
Wojcicki's YouTube existed to convert eyeballs into money. Democracy and social cohesion were somebody else's problem.
Page 120
"So, when YouTube claims they can't really say why the algorithm does what it does, they probably mean that very literally."
Page 120
The average user's time on the platform skyrocketed. The company estimated that 70 percent of its time on site, an astronomical share of its business, was the result of videos pushed by its algorithm-run recommendation system.
Page 121
In 2014, the same year that Wojcicki took over YouTube, Facebook's algorithm replaced its preference for Upworthy-style clickbait with something even more magnetic: emotionally engaging interactions. Across the second half of that year, as the company gradually retooled its systems, the platform's in-house researchers tracked 10 million users to understand the effects. They found that the changes artificially inflated the amount of pro-liberal content that liberal users saw and the amount of pro-conservative content that conservatives saw. Just as Pariser had warned. The result, even if nobody at Facebook had consciously intended as much, was algorithmically ingrained hyperpartisanship.
Page 121
The process, as Facebook researchers somewhat gingerly put it, in an implied warning that the company did not heed, was "associated with adopting more extreme attitudes over time and misperceiving facts about current events."
Page 123
TikTok, a Chinese-made app, shows each user a stream of videos selected almost entirely by algorithms. Its A.I. is so sophisticated that TikTok almost immediately attracted 80 million American users, who often use it for hours at a time, despite most of its engineers not speaking English or understanding American culture.
Page 124
Like DiResta's anti-vaxxers, or even Upworthy, the Russians hijacked the algorithm's own preferences. It wasn't just that the agents repeated phrases or behaviors that performed well. Their apparent mission, of stirring up political discord, seemed to naturally align with what the algorithms favored anyway, often to extremes.
Page 125
"He was telling me, 'Oh, but there are so many videos, it has to be true,'" Chaslot said. "What convinced him was not the individual videos, it was the repetition. And the repetition came from the recommendation engine."
Page 125
illusory truth effect. We are, every hour of every day, bombarded with information. To cope, we take mental shortcuts to quickly decide what to accept or reject.
Page 125
One is familiarity; if a claim feels like something we've accepted as true before, it probably still
Page 125
When he searched YouTube for Pope Francis, for instance, 10 percent of the videos it displayed were conspiracies. On global warming, it was 15 percent. But the real shock came when Chaslot followed algorithmic recommendations for what to watch next, which YouTube has said accounts for most of its watch time. A staggering 85 percent of recommended videos on Pope Francis were conspiracies, asserting Francis's "true" identity or purporting to expose Satanic plots at the Vatican.
Page 128
But the influence of algorithms only deepened, including at the last holdout, Twitter. For years, the service had shown each user a simple, chronological feed of their friends' tweets. Until, in 2016, it introduced an algorithm that sorted posts -- for engagement, of course, and to predictable effect.
Page 128
"The recommendation engine appears to reward inflammatory language and outlandish claims."
Page 129
Shortly after Twitter algorithmified, Microsoft launched an A.I.-run Twitter account called Tay. The bot operated, like the platforms, on machine learning, though with a narrower goal: to converse convincingly with humans by learning from each exchange.
Page 129
Within twenty-four hours, Tay's tweets had taken a disturbing turn.
Page 129
"You absolutely do NOT let an algorithm mindlessly devour a whole bunch of data that you haven't vetted even a little bit."
Six: The Fun House Mirror
Page 132
Conspiracy belief is highly associated with "anomie," the feeling of being disconnected from society.
Page 136
It was undeniable that Trump owed his rise to nondigital factors, too: the institutional breakdown of the Republican Party, a decades-long rise in polarization and public distrust, white backlash to social change, a radicalized right-wing electorate. Social media had created none of these. But, in time, a network of analysts and whistleblowers would prove that it had exacerbated them all, in some cases drastically.
Page 138
moral outrage can become infectious in groups, and that it can alter the mores and behaviors of people exposed to it.
Page 138
across topics, across political factions, what psychologists refer to as "moral-emotional words" consistently boosted any tweet's reach.
Page 138
Moral-emotional words convey feelings like disgust, shame, or gratitude.
Page 138
calls for, communal judgment,
Page 138
That makes these words different from either narrowly emotional sentiments ("Overjoyed at today's marriage equality ruling") or purely moral ones ("The president is a liar"), for which Brady's effect didn't appear. Tweets with moral-emotional words, he found, traveled 20 percent farther -- for each moral-emotional word.
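Read multiplicatively (my framing, not Brady's notation), the per-word effect compounds: reach(k) ≈ reach(0) × 1.2^k for a tweet with k moral-emotional words. Three such words give 1.2^3 ≈ 1.73, roughly 73 percent farther than the same tweet with none.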
Page 139
Brady found something else. When a liberal posted a tweet with moral-emotional words, its reach substantially increased among other liberals, but declined with conservatives. (And vice versa.) It won the user more overall attention and validation, in other words, at the cost of alienating people from the opposing side. Proof that Twitter encouraged polarization.
Page 143
They were acting on a widely held misinterpretation of something known as contact theory. Coined after World War II to explain why desegregated troops became less prone to racism, the theory suggested that social contact led distrustful groups to humanize one another. But subsequent research has shown that this process works only under narrow circumstances: managed exposure, equality of treatment, neutral territory, and a shared task. Simply mashing hostile tribes together, researchers repeatedly found, worsens animosity.
Page 143
People, as a rule, perceive out-groups as monoliths.
Page 144
Even in its most rudimentary form, the very structure of social media encourages polarization. Reading an article and then the comments field beneath it, an experiment found, leads people to develop more extreme views on the subject in the article.
Page 144
Control groups that read the article with no comments became more moderate and open-minded.
Page 144
News readers, the researchers discovered, process information differently when they are in a social environment: social instincts overwhelm reason, leading them to look for affirmation of their side's righteousness.
Page 148
The data revealed, as much as any foreign plot, the ways that the Valley's products had amplified the reach, and exacerbated the impact, of malign influence. (She later termed this "ampliganda," a sort of propaganda whose power comes from its propagation by masses of often unwitting people.)
Page 149
Over many iterations, the Russians settled on a strategy. Appeal to people's group identity. Tell them that identity was under attack. Whip up outrage against an out-group. And deploy as much moral-emotional language as possible.
Page 152
the internet offered political outsiders a way around the mainstream outlets that shunned them. As those candidates' grassroots supporters spent disproportionate time on YouTube, the system learned to push users to those videos, creating more fans, driving up watch time further. But thanks to the preferences of the algorithms for extreme and divisive content, it was mostly fringe radicals who benefited, and not candidates across the spectrum.
Page 154
The platforms, they concluded, were reshaping not just online behavior but underlying social impulses, and not just individually but collectively, potentially altering the nature of "civic engagement and activism, political polarization, propaganda and disinformation." They called it the MAD model, for the three forces rewiring people's minds.
Page 154
Motivation: the instincts and habits hijacked by the mechanics of social media platforms. Attention: users' focus manipulated to distort their perceptions of social cues and mores. Design: platforms that had been constructed in ways that train and incentivize certain behaviors.
Page 156
As psychologists have known since Pavlov, when you are repeatedly rewarded for a behavior, you learn a compulsion to repeat it. As you are trained to turn all discussions into matters of high outrage, to express disgust with out-groups, to assert the superiority of your in-group, you will eventually shift from doing it for external rewards to doing it simply because you want to do it. The drive comes from within. Your nature has been changed.
Page 156
The second experiment demonstrated that the attention economy, by tricking users into believing that their community held more extreme and divisive views than it really did, had the same effect. Showing subjects lots of social media posts from peers that expressed outrage made them more outrage-prone themselves.
Page 156
It was a chilling demonstration of how portraying people and events in sharply moral-emotional terms brings out audiences' instincts for hatred and violence -- which is, after all, exactly what social platforms do, on a billions-strong scale, every minute of every day.
Eight: Church Bells
Page 181
the rumors activated a sense of collective peril in groups that were dominant but felt their status was at risk -- majorities angry and fearful over change that threatened to erode their position in the hierarchy.
Page 181
status threat. When members of a dominant social group feel at risk of losing their position, it can spark a ferocious reaction. They grow nostalgic for a past, real or imagined, when they felt secure in their dominance ("Make America Great Again").
Page 182
We don't just become more tribal; we lose our sense of self. It's an environment, they wrote, "ripe for the psychological state of deindividuation."
Page 182
surrendering part of your will to that of the group.
Page 182
deindividuation, with its power to override individual judgment, and status threat, which can trigger collective aggression on a terrible scale.
Page 185
Anti-refugee sentiment is among the purest expressions of status threat, combining fear of demographic change with racial tribalism.
Page 186
There's a term for the process Pauli described, of online jokes gradually internalized as sincere. It's called irony poisoning. Heavy social media users often call themselves "irony poisoned," a joke on the dulling of the senses that comes from a lifetime engrossed in social media subcultures, where ironic detachment, algorithmic overstimulation, and dare-to-offend humor prevail.
Page 186
Desensitization makes the ideas seem less taboo or extreme, which in turn makes them easier to adopt.
Page 188
defining traits and tics of superposters, mapped out in a series of psychological studies, are broadly negative. One is dogmatism: "relatively unchangeable, unjustified certainty." Dogmatists tend to be narrow-minded, pushy, and loud. Another: grandiose narcissism, defined by feelings of innate superiority and entitlement.
Page 189
Narcissists are consumed by cravings for admiration and belonging, which makes social media's instant feedback and large audiences all but irresistible.
Nine: The Rabbit Hole
Page 203
"Really the only place where they could exchange their thoughts and coalesce and find allies was online." These groups didn't reflect real-world communities of any significant size, he realized. They were native to the web -- and, as a result, shaped by the digital spaces that had nurtured them. Climate skeptics largely gathered in the comments sections of newspapers and blogs. There, disparate contrarians and conspiracists, people with no shared background beyond a desire to register their objection to climate coverage, got clumped together. It created a sense of common purpose.
Page 208
By January 2018, Kaiser had amassed enough evidence to begin slowly going public. He told a Harvard seminar that the coalescing far right of which the Charlottesville gathering was a part was "not done by users," he was coming to believe, at least not entirely, but had been in part "created through the YouTube algorithm."
Page 209
Canadian psychology professor. In 2013, Peterson began posting videos addressing, amid esoteric Jungian philosophy, young male distress.
Page 209
YouTube searches for "depression" or certain self-help keywords often led to Peterson. His videos' unusual length, sixty minutes or more, aligns with the algorithm's drive to maximize watch time. So does his college-syllabus method of serializing his argument over weeks, which requires returning for the next lecture and the next.
Page 209
Michael Kimmel calls "aggrieved entitlement."
Page 210
YouTube's algorithm, in many cases, tapped into that discontent, recommending channels that took Peterson's message to greater and greater extremes.
Page 210
Users who comment on Peterson's videos subsequently become twice as likely to pop up in the comments of extreme-right YouTube channels, a Princeton study found.
Page 210
the algorithm makes the connection.
Page 210
The scholar J. M. Berger calls it "the crisis-solution construct." When people feel destabilized, they often reach for a strong group identity to regain a sense of control.
Page 211
Incel forums had begun as places to share stories about feeling lonely. Users discussed how to cope with living "hugless." But the norms of social media one-upmanship, of attention chasing, still prevailed. The loudest voices rose.
Page 212
By 2021, fifty killings had been claimed by self-described incels, a wave of terrorist violence.
Page 212
The movement was a fringe of a fringe, dwarfed by Pizzagate or the alt right. But it hinted at social media's potential to galvanize young white male anomie into whole communities of extremism -- an increasingly widespread phenomenon.
Page 214
These channels were her "YouTube friends," salve for a lost marriage and feelings of isolation. They were community. They were identity.
Page 214
YouTube's system, they found, did three things uncannily well.
Page 214
it stitched together wholly original clusters of channels.
Page 214
There was nothing innate connecting these beyond the A.I.'s conclusion that showing them alongside one another would keep users watching.
Page 215
YouTube's recommendations generally moved toward the more extreme end of whatever network the user was in.
Page 215
the third discovery.
Page 215
the system's recommendations were clustering mainstream right-wing channels, and even some news channels, with many of the platform's most virulent hatemongers, incels, and conspiracy theorists.
Page 216
One channel sat conspicuously in the network's center, a black hole toward which YouTube's algorithmic gravity pulled: Alex Jones.
Page 216
Rauchfleisch warned, "Being a conservative on YouTube means that you're only one or two clicks away from extreme far-right channels, conspiracy theories, and radicalizing content."
Page 219
Jack Dorsey, Twitter's CEO,
Page 219
"We didn't fully predict or understand the real-world negative consequences" of launching an "instant, public, global" platform, he wrote that March. He conceded that it had resulted in real harms. He began, in interviews, voluntarily raising heretical ideas that other tech CEOs continued to fervently reject: maximizing for engagement is dangerous; likes and retweets encourage polarization. The company, he said, would reengineer its systems to promote "healthy" conversations rather than engaging ones. He hired prominent experts and research groups to develop new features or design elements to do it.
Page 220
dangerous. "They told me, 'People click on Flat Earth videos, so they want a Flat Earth video,'" he recalled. "And my point was, no, it's not that because someone clicked on the Flat Earth video, he wants to be lied to. He is just curious, and there is a clickbait title. But to the algorithm, when you watch a video, it means you endorse it."
Page 221
YouTube, by showing users many videos in a row all echoing the same thing, hammers especially hard at two of our cognitive weak points: repeated exposure to a claim and the impression that the claim is widely accepted each make it feel truer than we would otherwise judge it to be.
Page 221
The post went on for twenty more lines. References just cryptic enough that users could feel like they were cracking a secret code, and obvious enough to ensure that they would.
Page 222
Followers got more than a story. QAnon, as the movement called itself, became a series of online communities where believers gathered to parse Q's posts.
Page 222
Extremist groups have long recruited on a promise to fulfill adherents' need for purpose and belonging.
Page 222
Conspiracies insist that events, rather than uncontrollable or impersonal, are all part of a hidden plot whose secrets you can unlock. Reframing chaos as order, telling believers they alone hold the truth, restores their sense of autonomy and control. It's why QAnon adherents often repeat to one another their soothing mantra: "Trust the plan."
Page 224
But for all the feelings of autonomy, security, and community that QAnon offered, it came at a cost: crushing isolation.
Page 225
It was one of the things that made QAnon so radicalizing. Joining often worsened the very sense of isolation and being adrift that had led people to it in the first place. With nowhere else to turn and now doubly needful of reassurance, followers gave themselves over to the cause even more fully.
Page 225
Renée DiResta's
Page 225
"I can't emphasize enough what a disaster Groups are," she tweeted in 2018, as evidence mounted. "The Groups recommendation engine is a conspiracy correlation matrix. It pushes people prone to extremist & polarizing content into closed and then secret groups. FB has no idea what it's built here."
Page 226
In time, much as 4chan's transgressiveness became an in-group shibboleth, so did desensitization on 8chan. Tolerating things too shocking or unbearable for outsiders was a way to prove you belonged.
Page 231
The investigators, citing interviews and forensic reconstructions of his web history, concluded that "YouTube was, for him, a far more significant source of information and inspiration" than any other platform had been.
Ten: The New Overlords
Page 240
THERE IS SO much that makes social media techno-governance peculiar. The hubris of both its scale and its secrecy. The belief that politics and social relations are engineering problems. The faith in engineers to solve them. The naivete in thinking that they had done so, or at least enough to keep expanding.
Page 243
The politics of the PayPal founders leaned severely libertarian: they were socially Darwinian, distrustful of government, certain that business knew best. Thiel took this to such extremes that in 2009, he announced, "I no longer believe that freedom and democracy are compatible." Society could no longer be trusted to "the unthinking demos that guides so-called social democracy," he wrote, using the Greek term for citizens. Only "companies like Facebook" could safeguard liberty.
Page 244
DiResta said. It was about algorithmic amplification, online incentives that led unwitting users to spread propaganda, and the ease with which bad actors could "leverage the entire information ecosystem to manufacture the appearance of popular consensus."
Page 247
The changes were dramatic. People who deleted Facebook became happier, more satisfied with their life, and less anxious. The emotional change was equivalent to 25 to 40 percent of the effect of going to therapy -- a stunning drop for a four-week break.
Page 247
Facebook quitters also spent 15 percent less time consuming the news.
Page 248
Even Silicon Valley was beginning to internalize the backlash. An internal poll of 29,000 Facebook employees taken that October found that the share of employees who said they were proud to work at Facebook had declined from 87 to 70 percent in just a year.
Page 249
in practice, social media did not abolish establishments so much as replace them. Its algorithms and incentives now acted as gatekeepers, determining who rose or fell.
Page 249
From the beginning, the Yellow Vests, as they termed themselves, identified as a leaderless, radically horizontal movement. Social media had, unquestionably, enabled this.
Page 251
Without the underlying infrastructure, social media movements are less able to organize coherent demands, coordinate, or act strategically.
Page 251
And by channeling popular energy away from the harder kind of organizing, it preempts traditional movements from emerging.
Page 260
Yaël Eisenstat,
Page 260
claimed to have watched company policymakers work hard to balance democratic integrity with Facebook's mission, only to be overruled by "the few voices who ultimately decided the company's overall direction." Facebook, she warned, was failing "the biggest test of whether it will ever truly put society and democracy ahead of profit and ideology."
Page 262
came to think of Facebook's policy team as akin to Philip Morris scientists tasked with developing a safer, better filter. In one sense, cutting down the carcinogens ingested by billions of smokers worldwide saved or prolonged lives on a scale few of us could ever match. In another sense, those scientists were working for the cigarette company, advancing the cause of selling cigarettes that harmed people at an enormous scale.
Page 264
Stanford's Persuasive Tech Lab, where academics and engineers teamed up to develop maximally addictive services, renamed itself the "Behavior Design Lab." Its chief tweeted, "We will start to realize that being chained to your mobile phone is a low-status behavior, similar to smoking." Nir Eyal, the consultant who'd pioneered slot machines as the model for social media platforms, pivoted from screen-time-maximization guru to screen-time-reduction guru, publishing a book with the title Indistractable.
Page 265
This was the real governance problem, I came to believe. If it was taboo to consider that social media itself, like cigarettes, might be causing the harms that seemed to consistently follow its adoption, then employees tasked with managing those harms were impossibly constrained.
Eleven: Dictatorship of the Like
Page 287
Later that year, YouTube announced that it had made changes to its algorithm aimed at reducing "the spread of borderline content and harmful misinformation." But some of those changes had already been in effect when we'd done our reporting, raising questions about their effectiveness. The company touted a somewhat oblique metric for success: "A 70% drop in watch time of this content coming from non-subscribed recommendations in the U.S."
Page 290
YouTube had cultivated an enormous audience of viewers who had never sought the content out, but rather were pulled into it by the platform's recommendations.
Page 290
"As they get desensitized to those pictures, if they're on that scale," Rogers said, "then they're going to seek out stuff that's even more thrilling, even more titillating, even more sexualized."
Page 292
It seemed as if YouTube was trying to tidy up without acknowledging there had been anything to tidy.
Twelve: Infodemic
Page 320
As in so many cases before, whether with incels or Boogaloos, what had begun as online bluster for the sake of finding community amid disorientation became, on platforms that rewarded escalation and created a false sense of consensus around the most extreme views, a sincere will to action.
Page 326
There was, if not a sea change in the Valley, then at least a flash of reckoning.
Page 326
The day after the riot, Facebook announced it would block Trump from using its services at least until the inauguration two weeks later.
Page 326
too. YouTube, the last major holdout, followed four days later.
Epilogue: Whistleblowing
Page 329
A WINDOW OPENED in the weeks after the Capitol siege. Unlike in the failed reckonings of 2016 and 2018, there was, finally, broad understanding of social media's consequences.
Page 329
But the window quickly closed.
Page 329
The social media giants were invested too deeply in status quo financial and ideological models for such radical change.
Page 331
Facebook and others shifted from promising that they had learned their lesson and would finally change to insisting, even more stridently than they had before January 6, that all the evidence pointing to their responsibility was simply wrong.
Page 333
Australians could, of course, access news or government websites directly. Still, Facebook had, by deliberate design, made itself essential, training users to rely on its platform as the end-all for news and information.
Page 336
Collectively, the documents told the story of a company fully aware that its harms sometimes exceeded even critics' worst assessments. At times, the reports warned explicitly of dangers that later became deadly, like a spike in hate speech or in vaccine misinformation, with enough notice for the company to have acted, and possibly saved lives, had it not refused to do so.
Page 338
Coercing the companies into regulating themselves is also an uncertain path.
Page 338
When asked what would most effectively reform both the platforms and the companies overseeing them, Haugen had a simple answer: turn off the algorithm.