LikeWar: The Weaponization of Social Media

The War Begins
  • Page 8 With careful editing, an indecisive firefight could be recast as a heroic battlefield victory. ... videos and images moved faster than the truth. ... the abrupt fall of Mosul showed that there was another side to computerized war. The Islamic State, which had no real cyberwar capabilities to speak of, had just run a military offensive like a viral marketing campaign and won a victory that shouldn't have been possible. It hadn't hacked the network; it had hacked the information on it.
  • Page 9 In the Syrian civil war where ISIS first roared to prominence, nearly every rebel group used YouTube to recruit, fundraise, and train. ... Just as the modern internet had "disrupted" the worlds of entertainment, business, and dating, it was now disrupting war and politics. It was a revolution that no leader, group, army, or nation could afford to ignore.
  • Page 12 Much of this violence starts with gangs' use of social media to "cybertag" and "cyberbang." Tagging is an update of the old-school practice of spray-painting graffiti to mark territory or insult a rival. The "cyber" version is used to promote your gang or to start a flame war by including another gang's name in a post or mentioning a street within a rival gang's territory. These online skirmishes escalate quickly. Anyone who posts about a person or a street belonging to a rival gang is making an online show of disrespect. Such a post is viewed as an invitation to "post up," or retaliate.
  • Page 13 social media creates a new reality "no longer limited to the perceptual horizon," in which an online feud can seem just as real as a face-to-face argument.
  • Page 13 anyone can start a feud online, but everyone has a collective responsibility to make sure it gets consummated in the real world.
  • Page 15 Diplomacy has become less private and policy-oriented and more public and performative.
  • Page 16 And it's not just diplomats. For the first time, entire populations have been thrown into direct and often volatile contact with each other. Indians and Pakistanis have formed dueling "Facebook militias" to incite violence and stoke national pride.
  • Page 16 Even in authoritarian states, war has never been so democratic.
  • Page 16 They steered discussion, sowed doubt, and obfuscated truth, launching the most politically consequential information attack in history. And that operation continues to this day.
  • Page 18 Modern warfare has seen numerous efforts to target and drain an enemy's spirit, almost never with success.
  • Page 18 Attacking an adversary's most important center of gravity -- the spirit of its people -- no longer requires massive bombing runs or reams of propaganda. All it takes is a smartphone and a few idle seconds. And anyone can do it.
  • Page 18 Opposing soldiers on a battlefield might find each other online and then "like" or troll their foes.
  • Page 19 From the world's most powerful nations to the pettiest flame war combatants, all of today's fighters have turned social media into a weapon in their own national and personal wars, which often overlap. They all fight to bend the global information environment to their will. The internet, once a light and airy place of personal connection, has since morphed into the nervous system of modern commerce. It has also become a battlefield where information itself is weaponized.
  • Page 20 while the truth is more widely available than ever before, it can be buried in a sea of "likes" and lies.
  • Page 21 Narrative, emotion, authenticity, community, and inundation are the most effective tools of online battles, and their mastery guides the efforts of most successful information warriors.
  • Page 21 On a network of billions, a tiny number of individuals can instantly turn the tide of an information war one way or another, often unintentionally.
  • Page 21 First, the internet has left adolescence. Over decades of growth, the internet has become the preeminent medium of global communication, commerce, and politics.
  • Page 22 This pattern resembles the trajectory of the telegraph, telephone, radio, and television before it. But the rise of social media has allowed the internet to surpass those revolutions. It is now truly global and instantaneous -- the ultimate combination of individual connection and mass transmission.
  • Page 22 the internet has become a battlefield. As integral as the internet has become to business and social life, it is now equally indispensable to militaries and governments, authoritarians and activists, and spies and soldiers.
  • Page 22 Third, this battlefield changes how conflicts are fought.
  • Page 22 Fourth, this battle changes what "war" means. Winning these online battles doesn't just win the web, but wins the world. Each ephemeral victory drives events in the physical realm, from seemingly inconsequential celebrity feuds to history-changing elections.
  • Page 22 Fifth, and finally, we're all part of this war. If you are online, your attention is like a piece of contested territory, being fought over in conflicts that you may or may not realize are unfolding around you. Everything you watch, like, or share represents a tiny ripple on the information battlefield, privileging one side at the expense of others. Your online attention and actions are thus both targets and ammunition in an unending series of skirmishes. Whether you have an interest in the conflicts of LikeWar or not, they have an interest in you.

Every Wire a Nerve
  • Page 25 But using the internet isn't really the same as understanding it. The "internet" isn't just a series of apps and websites. Nor is it merely a creature of the fiber-optic cable and boxy servers that provide its backbone. It is also a galaxy of billions of ideas, spreading through vast social media platforms that each pulse with their own entropic rhythm. At the same time, it is a globe-spanning community vaster and more diverse than anything before it, yet governed by a handful of Silicon Valley oligarchs.
  • Page 28 By 1450, [Gutenberg] was peddling his mass-produced Bibles across Germany and France. Predictably, the powers of the day tried to control this disruptive new technology. The monks and scribes, who had spent decades honing their hand-copying techniques, called for rulers to ban it, arguing that mass production would strangle the "spirituality" of the copying process. ... In what would become a familiar pattern, the new technology transformed not just communications but war, politics, and the world. ... The technology would also create new powers in society and place the old ones under unwelcome scrutiny.
  • Page 29 the spread of information, true or false, was limited by the prevailing transportation of the day. ... The world changed decisively in 1844, the year Samuel Morse successfully tested his telegraph (from the Greek words meaning "far writer"). By harnessing the emerging science of electricity, the telegraph ended the tyranny of distance.
  • Page 30 revolution. By 1850, there were 12,000 miles of telegraph wire and some 20 telegraph companies in the United States alone. ... By 1880, there would be 650,000 miles of wire worldwide -- 30,000 miles under the ocean -- that stretched from San Francisco to Bombay. ... Like the printing press before it, the telegraph quickly became an important new tool of conflict, which would also transform it.
  • Page 31 This intimacy could be manipulated, however. A new generation of newspaper tycoons arose, who turned sensationalism into an art form, led by Harvard dropout turned newspaper baron William Randolph Hearst. ... the kind of wild rumormongering American readers couldn't get enough [of] ... The electric wire of the telegraph, though, could only speak in dots and dashes. To use them required not just the infrastructure of a telegraph office, but a trained expert to operate the machine and translate its coded messages for you. ... the telephone in 1876. ... Within a year of its invention, the first phone was put in the White House. ... The telephone also empowered a new class of oligarchs. ... Telegraphs and phones had a crucial flaw, though. They shrank the time and simplified the means by which a message could travel a great distance, but they did so only between two points, linked by wire.
  • Page 32 The first radio "broadcast" took place in 1906, when an American engineer played "O Holy Night" on his violin. ... By 1924, there were an estimated 3 million radio sets and 20 million radio listeners in the United States alone. ... But radio also unleashed new political horrors.
  • Page 33 Just like the telegraph, radio would be used to foment war and become a new tool for fighting ... Hitler told his generals ... "The victor will not be asked whether he told the truth." ... The first working television in 1925 showed the face of a ventriloquist's dummy named Stooky Bill. From these humble beginnings, television soon rewired what people knew, what they thought, and even how they voted. By 1960, television sets were in nine of ten American homes, ... With a limited number of broadcasts to choose from, millions of families watched the same events and news anchors; they saw the same shows and gossiped eagerly about them the next day.
  • Page 37 ARPANET's original function had been remote computer use and file transfer, but soon email was devouring two-thirds of the available bandwidth.
  • Page 37 At precisely 11:44 A.M. EST on September 19, 1982, computer scientist Scott Fahlman changed history forever. In the midst of an argument over a joke made on email, he wrote: I propose that [sic] the following character sequence for joke markers: :-) Read it sideways. Actually, it is probably more economical to mark things that are NOT jokes, given current trends. For this, use :-(
  • Page 37 Even the early social platforms these computer scientists produced were just digital re-creations of old and familiar things: the postal service, bulletin boards, and newspapers.
  • Page 39 In 1990, there were 3 million computers connected to the internet. Five years later, there were 16 million. That number reached 360 million by the turn of the millennium.
  • Page 39 When Netscape went public in 1995, the company was worth $3 billion by the end of its first day, despite having never turned a profit. At that moment, the internet ceased to be the plaything of academics.
  • Page 40 In early 1994, a ragtag force of 4,000 disenfranchised workers and farmers rose up in Mexico's poor southern state of Chiapas. They called themselves the Zapatista National Liberation Army (EZLN).
  • Page 40 as the Mexican military stood ready to crush the remnant -- the government declared a sudden halt to combat.
  • Page 40 But upon closer inspection, there was nothing conventional about this conflict. More than just fighting, members of the EZLN had been talking online. They shared their manifesto with like-minded leftists in other countries, declared solidarity with international labor movements protesting free trade (their revolution had begun the day the North American Free Trade Agreement, or NAFTA, went into effect), established contact with international organizations like the Red Cross, and urged every journalist they could find to come and observe the cruelty of the Mexican military firsthand. Cut off from many traditional means of communication, they turned en masse to the new and largely untested power of the internet.
  • Page 41 Everywhere, there were signs that the internet's relentless pace of innovation was changing the social and political fabric of the real world. ... In 1996, Manuel Castells, among the world's foremost sociologists, made a bold prediction: "The internet's integration of print, radio, and audiovisual modalities into a single system promises an impact on society comparable to that of the alphabet."
  • Page 41 [In] 1999, musician David Bowie sat for an interview with the BBC. ... "Up until at least the mid-1970s, we really felt that we were still living under the guise of a single, absolute, created society -- where there were known truths and known lies and there was no kind of duplicity or pluralism about the things that we believed in," the artist once known as Ziggy Stardust said. "[Then] the singularity disappeared. And that I believe has produced such a medium as the internet, which absolutely establishes and shows us that we are living in total fragmentation."
  • Page 43 As the internet went commercial and growth exploded, people started to explore how to profit from our willingness, and perhaps our need, to share.
  • Page 45 Increasingly, however, websites could process user commands; access and update vast databases; and even customize users' experience based on hundreds or thousands of variables. ... The internet was becoming not just faster but more visual. It was both user-friendly and, increasingly, user-controlled. Media entrepreneur Tim O'Reilly dubbed this new, improved internet "Web 2.0."
  • Page 47 Although nobody knew it at the time, the introduction of the iPhone also marked a moment of destruction. Family dinners, vacations, awkward elevator conversations, and even basic notions of privacy -- all would soon be endangered by the glossy black rectangle Jobs held triumphantly in his hand. ... A year later, Apple officially unveiled its App Store. This marked another epochal shift. For more than a decade, a smartphone could be used only as a phone, calculator, clock, calendar, and address book. Suddenly, the floodgates were thrown open to any possibility, as long as they were channeled through a central marketplace.
  • Page 48 By 2013, there were some 2 billion mobile broadband subscriptions worldwide; by 2018, 6 billion. By 2020, that number is expected to reach 8 billion. ... In the United States, where three-quarters of Americans own a smartphone, these devices have long since replaced televisions as the most commonly used piece of technology. ... [It] was the network, rather than the content on it, that mattered.
  • Page 49 Soon enough, Twitter was transforming the news -- not just how it was experienced (as with Michael Jackson's death in 2009), but how it was reported. Journalists took to using social media to record notes and trade information, sharing the construction of their stories in real time. ... The social network had become where people decided what merited news coverage and what didn't. ... Twitter also offered a means for those being reported on to bypass journalists. Politicians and celebrities alike turned to it to get their own messages out. ... Blistering advancements in smartphone camera quality and mobile bandwidth also began to change what a social network could look like. ... By 2017, Instagram was adding more than 60 million photographs to its archives each day.
  • Page 50 As Tim Berners-Lee has written, "The web that many connected to years ago is not what new users will find today. What was once a rich selection of blogs and websites has been compressed under the powerful weight of a few dominant platforms. This concentration of power creates a new set of gatekeepers, allowing a handful of platforms to control which ideas and opinions are seen and shared . . . What's more, the fact that power is concentrated among so few companies has made it possible to [weaponize] the web at scale." ... WeChat, a truly remarkable social media model, arose in 2011, unnoticed by many Westerners. Engineered to meet the unique requirements of the enormous but largely isolated Chinese internet, WeChat may be a model for the wider internet's future. Known as a "super app," it is a combination of social media and marketplace, the equivalent of companies like Facebook, Twitter, Amazon, Yelp, Uber, and eBay all fused into one, sustaining and steering a network of nearly a billion users.
  • Page 51 It is an app so essential to modern living that Chinese citizens quite literally can't do without it: they're not allowed to delete their accounts. ... According to U.S. National Intelligence Council estimates, more people in sub-Saharan Africa and South Asia have access to the internet than to reliable electricity.
  • Page 52 This is what the internet has become. It is the most consequential communications development since the advent of the written word. Yet, like its precursors, it is inextricably tied to the age-old human experiences of politics and war. Indeed, it is bound more closely than any platform before it. For it has also become a colossal information battlefield, one that has obliterated centuries' worth of conventional wisdom about what is secret and what is known. It is to this revolution that we turn next.
  • Page 54 When the internet first began to boom in the 1990s, internet theorists proclaimed that the networked world would lead to a wave of what they called "disintermediation." They described how, by removing the need for "in-between" services, the internet would disrupt all sorts of longstanding industries.
  • Page 55 No longer did a reporter need to be a credentialed journalist working for a major news organization. A reporter could be anyone who was in the right place at the right time.
  • Page 57 the web-driven radical transparency that was just starting to change how information was gathered and shared -- even the nature of secrecy itself. ... Some sensors are self-evident, like the camera of a smartphone. Others lurk in the background, like the magnetometer and GPS that provide information about direction and location. These billions of internet-enabled devices, each carrying multiple sensors, are on pace to create a world of almost a trillion sensors. ... Each tweet posted on Twitter, for instance, carries with it more than sixty-five different elements of metadata.
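To make "elements of metadata" concrete, here is a minimal sketch, assuming a payload shaped like Twitter's long-documented v1.1 tweet object (the field selection and all values below are invented for illustration): the visible post is one short string, while the surrounding fields record when, where, from what device, and by whom it was sent.

```python
import json

# Illustrative only: a small slice of the metadata that rides along with a
# single tweet. Field names mirror Twitter's classic v1.1 tweet object; the
# values are made up.
sample_tweet = {
    "created_at": "Sat Jan 13 08:07:00 +0000 2018",
    "id_str": "952000000000000000",
    "text": "Morning run along the river.",
    "source": '<a href="http://twitter.com/download/iphone">Twitter for iPhone</a>',
    "lang": "en",
    "coordinates": {"type": "Point", "coordinates": [-77.0365, 38.8977]},
    "retweet_count": 2,
    "user": {
        "screen_name": "example_runner",
        "location": "Washington, DC",
        "created_at": "Tue Feb 03 11:00:00 +0000 2015",
        "followers_count": 412,
    },
    "entities": {"hashtags": [{"text": "running"}], "urls": [], "user_mentions": []},
}

def background_signals(tweet: dict) -> dict:
    """Pull out the context a casual reader never sees: timing, client app,
    location, and the shape of the author's account."""
    return {
        "posted_at": tweet["created_at"],
        "client_app": tweet["source"],
        "geo": tweet.get("coordinates"),
        "author_location": tweet["user"].get("location"),
        "account_created": tweet["user"]["created_at"],
        "audience_size": tweet["user"]["followers_count"],
    }

print(json.dumps(background_signals(sample_tweet), indent=2))
```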
  • Page 58 The amount of data being gathered about the world around us and then put online is astounding. ... an interesting tidbit might lie in the technical background. Exercise apps have inadvertently revealed everything from the movements of a murderer committing his crime to the location of a secret CIA "black site" facility in the Middle East. (A heat map made from tracing agents' daily jogs around the perimeter of their base provided a near-perfect outline of one installation.)
  • Page 59 what stands out about all this information is not just its massive scale and form. It is that most of it is about us, pushed out by us. ... we are now our own worst mythological monsters -- not just watchers but chronic over-sharers. ... At the current pace, the average American millennial will take around 26,000 selfies in their lifetime.
  • Page 60 The first sitting world leader to use social media was Canadian prime minister Stephen Harper in 2008, followed quickly by U.S. president Barack Obama. ... [In] the United States, the Army, Navy, Air Force, and Marine Corps all have an official social media presence. So do their bases, combat units, generals, and admirals. ... The result of all this sharing is an immense, endlessly multiplying churn of information and viewpoints. ... According to law professor Jeffrey Rosen, the social media revolution has essentially marked "the end of forgetting."
  • Page 64 As the smoke cleared, the Mumbai attack left several legacies. It was a searing tragedy visited upon hundreds of families. It brought two nuclear powers to the brink of war. And it foreshadowed a major technological shift. Hundreds of witnesses -- some on-site, some from afar -- had generated a volume of information that might previously have taken months of diligent reporting to gather. By stitching these individual accounts together, the online community had woven seemingly disparate bits of data into a cohesive whole. It was like watching the growing synaptic connections of a giant electric brain. ... At its core, crowdsourcing is about redistributing power -- vesting the many with a degree of influence once reserved for the few.
  • Page 67 Serious reflection on the past is hijacked by the urgency of the current moment; serious planning for the future is derailed by never-ending distraction. Media theorist Douglas Rushkoff has described this as "present shock." Buffeted by a constant stream of information, many internet users can feel caught in a struggle just to avoid being swept away by the current.
  • Page 67 In a sense, everyone has become part of the news. And while people who serve to make sense of the madness still exist, the character and identity of these gatekeepers have transformed as well.
  • Page 68 Across the world, there is a new breed of journalist, empowered by the web, often referred to as the "citizen reporter."
  • Page 70 A common thread runs through all of these stories. From favela life to cartel bloodlettings to civil wars, social media has erased the distinction between citizen, journalist, activist, and resistance fighter. Anyone with an internet connection can move seamlessly between these roles. Often, they can play them all at once.
  • Page 75 "open-source intelligence" (OSINT).
  • Page 77 At its most promising, the OSINT revolution doesn't just help people parse secrets from publicly accessible information; it may also help them predict the future. ... Predata is a small company founded by James Shinn, a former CIA agent. ... Predata uses such mass monitoring to discern online patterns that might be used to project real-world occurrences. Each Sunday, it sends out a "Week Ahead" mailer, breaking down the statistical likelihood of particular contingencies based on web monitoring.
  • Page 82 it is also harder than ever to separate the truth from lies. In the right hands, those lies can become powerful weapons.

The Empires Strike Back
  • Page 84 The first so-called internet revolution shook Serbia in 1996. Cut off from state media, young people used mass emails to plan protests against the regime of President Slobodan Milošević. ... Although the initial protests failed, they returned stronger than ever in 2000, being organized even more online.
  • Page 85 Political unrest soon rocked Syria, Jordan, Bahrain, and a dozen more nations. In Libya and Yemen, dictators who had ruled for decades through the careful control of their population and its sources of information saw their regimes crumble in a matter of days. ... Tech evangelists hailed what was soon called the Arab Spring as the start of a global movement that would end the power of authoritarian regimes around the world, perhaps forever.
  • Page 86 The Net Delusion, ... As it turned out, the Arab Spring didn't signal the first steps of a global, internet-enabled democratic movement. Rather, it represented a high-water mark. The much-celebrated revolutions began to fizzle and collapse. In Libya and Syria, digital activists would soon turn their talents to waging internecine civil wars. In Egypt, the baby named Facebook would grow up in a country that quickly turned back to authoritarian government, the new regime even more repressive than Mubarak's. ... In truth, democratic activists had no special claim to the internet. They'd simply gotten there first.
  • Page 88 For all the immensity of today's electronic communications network, the system remains under the control of only a few thousand internet service providers (ISPs), the firms that run the backbone, or "pipes," of the internet. Just a few ISPs supply almost all of the world's mobile data. ... Many of these ISPs hardly qualify as "businesses" at all. They are state-sanctioned monopolies or crony sanctuaries directed by the whim of local officials. ... Designed as an open system and built on trust, the web remains vulnerable to governments that play by different rules. ... All told, sixty-one countries so far have created mechanisms that allow for national-level internet cutoffs.
  • Page 89 These blackouts come at a cost. A 2016 study of the consequences of eighty-one instances of internet cutoffs in nineteen countries assessed the economic damage. Algeria's economy lost at least $20 million during that three-day shutdown, while a larger economy like Saudi Arabia lost $465 million from an internet shutdown in May 2016. ... A variant of this cutoff strategy is "throttling." Whereas internet blocks cut off access completely, throttling slows down connections. It allows vital online functions to continue while making mass coordination more difficult. ... Web monitoring services, for instance, have noticed that every time a protest is planned in Iran, the country's internet coincidentally and conveniently slows to a crawl.
  • Page 90 But outside of the absolute-authoritarian state of North Korea (whose entire "internet" is a closed network of about thirty websites), the goal isn't so much to stop the signal as it is to weaken it. If one has to undertake extensive research and buy special equipment to circumvent government controls, the empowering parts of the internet are no longer for the masses.
  • Page 93 There was a broader lesson, he added. Social media was a "volatile political battleground." What was said and shared -- even a hasty retweet -- carried "real-world consequences."
  • Page 94 Over time, such harsh policing of online speech actually becomes less necessary as self-censorship kicks in. Communications scholars call it the "spiral of silence." Humans continually test their beliefs against those of the perceived majority and often quietly moderate their most extreme positions in order to get along better with society as a whole. By creating an atmosphere in which certain views are stigmatized, governments are able to shape what the majority opinion appears to be, which helps steer the direction of actual majority opinion.
  • Page 95 Yet there is more. Through the right balance of infrastructure control and enforcement, digital-age regimes can exert remarkable control over not just computer networks and human bodies, but the minds of their citizens as well. No nation has pursued this goal more vigorously -- or successfully -- than China.
  • Page 103 It is not surprising that Russia would pioneer this strategy. From its birth, the Soviet Union relied on the clever manipulation and weaponization of falsehood (called dezinformatsiya), both to wage ideological battles abroad and to control its population at home. One story tells how, when a forerunner of the KGB set up an office in 1923 to harness the power of dezinformatsiya, it invented a new word -- "disinformation" -- to make it sound of French origin instead. In this way, even the origin of the term was buried in half-truths.
  • Page 106 The outcome has been an illusion of free speech within a newfangled Potemkin village. "The Kremlin's idea is to own all forms of political discourse, to not let any independent movements develop outside its walls," writes Peter Pomerantsev, author of Nothing Is True and Everything Is Possible.
  • Page 108 Any content that grabs eyeballs and sows doubt represents a job well done. Snarky videos designed to go viral ... "They spin up their audience to chase myths, believe in fantasies, and listen to faux . . . 'experts' until the audience simply tunes out."
  • Page 110 Instead of trying to hide information from prying eyes, it remains in the open, buried under a horde of half-truths and imitations.
  • Page 110 Known as "web brigades," this effort entails an army of paid commenters (among them our charming philosophy major), who manage a vast network of online accounts. Some work in the "news division," others as "social media seeders," still others tasked with creating "demotivators": visual content designed to spread as far and quickly as possible. Unlike the 50-Cent Army of China, however, the Russian version isn't tasked with spreading positivity. ... In the words of our philosophy student's boss, his job was to sow "civil unrest" among Russia's foes. "This is information war, and it's official."
  • Page 111 Internet Research Agency, located in an ugly neo-Stalinist building in St. Petersburg's Primorsky District. They'd settle into their cramped cubicles and get down to business, assuming a series of fake identities known as "sockpuppets." The job was writing hundreds of social media posts per day, with the goal of hijacking conversations and spreading lies, all to the benefit of the Russian government. For this work, our philosophy major was paid the equivalent of $1,500 per month.
  • Page 112 The hard work of a sockpuppet takes three forms, ... One is to pose as the organizer of a trusted group. @Ten_GOP called itself the "unofficial Twitter account of Tennessee Republicans" and was followed by over 136,000 people (ten times as many as the official Tennessee Republican Party account). ... The second sockpuppet tactic is to pose as a trusted news source. With a cover photo image of the U.S. Constitution, @tpartynews presented itself as a hub for conservative fans of the Tea Party to track the latest headlines. ... Finally, sockpuppets pose as seemingly trustworthy individuals: a grandmother, a blue-collar worker from the Midwest, a decorated veteran, providing their own heartfelt take on current events (and who to vote for).
  • Page 113 By cleverly leveraging readers' trust, these engineers of disinformation induced thousands -- sometimes millions -- of people each day to take their messages seriously and spread them across their own networks via "shares" and retweets. ... These messages gained even greater power as they reached beyond social media, taking advantage of how professional news outlets -- feeling besieged by social media -- had begun embedding the posts of online "influencers" in their own news stories.
  • Page 116 A 2018 study from Oxford University's Computational Propaganda Research Project found that, all told, at least forty-eight regimes have followed this new model of censorship to "steer public opinion, spread misinformation, and undermine critics."

The Unreality Machine
  • Page 118 When all think alike, no one thinks very much. -- WALTER LIPPMANN, The Stakes of Diplomacy
  • Page 119 The Macedonians were awed by Americans' insatiable thirst for political stories. Even a sloppy, clearly plagiarized jumble of text and ads could rack up hundreds of thousands of "shares." The number of U.S. politics–related websites operated out of Veles ballooned into the hundreds. ... As one 17-year-old girl explained at the nightclub, watching the teen tycoons celebrate from her perch at the bar, "Since fake news started, girls are more interested in geeks than macho guys." ... As with their peddling of fad diets, the boys turned to political lies for the sole reason that this was what their targets seemed to want. "You see they like water, you give water," said Dmitri. "[If] they like wine, you give wine." There was one cardinal rule in the business, though: target the Trumpkins. It wasn't that the teens especially cared about Trump's political message, but, as Dmitri explained, "nothing [could] beat" his supporters when it came to clicking on their made-up stories.
  • Page 120 Indeed, the single most popular news story of the entire election -- "Pope Francis Shocks World, Endorses Donald Trump for President" -- was a lie fabricated in Macedonia before blasting across American social networks. ... At the same time that governments in Turkey, China, and Russia sought to obscure the truth as a matter of policy, the monetization of clicks and "shares" -- known as the "attention economy" -- was accomplishing much the same thing.
  • Page 122 Eli Pariser described the effect, and its dangerous consequences, in his 2011 book, The Filter Bubble. ... Yet, even as social media users are torn from a shared reality into a reality-distorting bubble, they rarely want for company. With a few keystrokes, the internet can connect like-minded people over vast distances and even bridge language barriers. ... social media guarantees that you can find others who share your views.
  • Page 123 It is all about us, or rather our love of ourselves and people like us. This phenomenon is called "homophily," meaning "love of the same." Homophily is what makes humans social creatures, able to congregate in such large and like-minded groups. It explains the growth of civilization and cultures. It is also the reason an internet falsehood, once it begins to spread, can rarely be stopped.
  • Page 124 The more often you hear a claim, the less likely you are to assess it critically. And the longer you linger in a particular community, the more its claims will be repeated until they become truisms -- even if they remain the opposite of the truth.
  • Page 126 Once the shared enemy was gone, wild allegations demonized former allies and drove people farther apart. ... "The speed, emotional intensity and echo-chamber qualities of social media content make those exposed to it experience more extreme reactions. Social media is particularly suited to worsening political and social polarization because of its ability to spread violent images and frightening rumors extremely quickly and intensely."
  • Page 127 Fact, after all, is a matter of consensus. Eliminate that consensus, and fact becomes a matter of opinion. Learn how to command and manipulate that opinion, and you are entitled to reshape the fabric of the world.
  • Page 134 What the stratagem revealed was that on social networks driven by homophily, the goal was to validate, not inform. ... John Herrman, 2014 ... "Content-marketed identity media speaks louder and more clearly than content-marketed journalism, which is handicapped by everything that ostensibly makes it journalistic -- tone, notions of fairness, purported allegiance to facts, and context over conclusions," he wrote. ... 59 percent of all links posted on social media had never been clicked on by the person who shared them. Simply sharing crazy, salacious stories became a form of political activism.
  • Page 138 But none of the cases were real -- and Dixson wasn't either. As Ben Nimmo, a fellow with the Digital Forensic Research Lab at the Atlantic Council, discovered, "Angee Dixson" was actually a bot -- a sophisticated computer program masquerading as a person.
  • Page 139 The most common form of this cheating is also the simplest. Fake followers and "likes" are easy to produce -- all they require is a dummy email address and social media account -- and they enjoy essentially unlimited demand.
  • Page 140 Today, social media bots spread a message; as often as not, it's human beings who become the slaves to it.
  • Page 141 These users then share the conversation with their own networks. The manufactured idea takes hold and spreads, courting ever more attention and unleashing a cascade of related conversations, and usually arguments. Most who become part of this cycle will have no clue that they're actually the playthings of machines. ... On Twitter, for instance, roughly 15 percent of its user base is thought to be fake. For a company under pressure to demonstrate user growth with each quarterly report, this is a valuable boost. ... Although botnets have been used to market everything from dish soap to albums, they're most common in the political arena. For authoritarian regimes around the world, botnets are powerful tools in their censorship and disinformation strategies.
  • Page 143 The 2016 U.S. presidential race, however, stands unrivaled in the extent of algorithmic manipulation. On Twitter alone, researchers discovered roughly 400,000 bot accounts that fought to sway the outcome of the race -- two-thirds of them in favor of Donald Trump.
  • Page 145 Originally, the three spaces were completely different.
  • Page 146 the data showed that a coordinated group of voices had entered these communities, and that these voices could be sifted out from the noise by their repeated word use. ... "Tens of thousands of bots and hundreds of human-operated, fake accounts acted in concert to push a pro-Trump, nativist agenda across all three platforms in the spring of 2016." ... The sockpuppets and bots had created the appearance of a popular consensus to which others began to adjust, altering what ideas were now viewed as acceptable to express. The repeated words and phrases soon spread beyond the fake accounts that had initially seeded them, becoming more frequent across the human users on each platform. The hateful fakes were mimicking real people, but then real people began to mimic the hateful fakes.
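A minimal sketch of that sifting idea, assuming nothing about the researchers' actual pipeline (this is a toy of my own): phrases pushed verbatim by many distinct accounts are treated as "seeded," and accounts whose posts lean heavily on seeded phrases get flagged as likely coordinated.

```python
from collections import defaultdict

def trigrams(text):
    """Break a post into overlapping three-word phrases."""
    words = text.lower().split()
    return {" ".join(words[i:i + 3]) for i in range(len(words) - 2)}

def flag_coordinated(posts, min_accounts=3, min_share=0.5):
    """posts: list of (account, text) pairs. Flag accounts whose posts are
    dominated by phrases that many other accounts also push verbatim."""
    phrase_accounts = defaultdict(set)   # phrase -> accounts using it
    account_posts = defaultdict(list)    # account -> list of phrase sets
    for account, text in posts:
        grams = trigrams(text)
        account_posts[account].append(grams)
        for g in grams:
            phrase_accounts[g].add(account)

    # Phrases repeated word-for-word across many accounts look "seeded."
    seeded = {g for g, accts in phrase_accounts.items() if len(accts) >= min_accounts}

    flagged = []
    for account, gram_sets in account_posts.items():
        hits = sum(1 for grams in gram_sets if grams & seeded)
        if gram_sets and hits / len(gram_sets) >= min_share:
            flagged.append(account)
    return flagged

posts = [
    ("bot_a", "secure the border now before it is too late"),
    ("bot_b", "we must secure the border now or lose everything"),
    ("bot_c", "patriots agree secure the border now"),
    ("human_1", "anyone know a good taco place downtown?"),
]
print(flag_coordinated(posts))  # -> ['bot_a', 'bot_b', 'bot_c']
```

The real studies worked at vastly larger scale, but the giveaway was the same: identical language surfacing across ostensibly unrelated accounts.

Win the Net, Win the Day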
  • Page 154 By successfully translating its seventh-century ideology into social media feeds, ISIS proved its finesse in what its supporters described as the "information jihad," a battle for hearts and minds as critical as any waged over territory. It did so through a clear, consistent message and a global network of recruiters.
  • Page 156 they'd built a "narrative." Narratives are the building blocks that explain both how humans see the world and how they exist in large groups. They provide the lens through which we perceive ourselves, others, and the environment around us.
  • Page 157 The stronger a narrative is, the more likely it is to be retained and remembered. ... By simplifying complex realities, good narratives can slot into other people's preexisting comprehension. If a dozen bad things happen to you on your way to work, you simply say you're having a "bad day," and most people will understand intuitively what you mean. The most effective narratives can thus be shared among entire communities, peoples, or nations, because they tap into our most elemental notions.
  • Page 158 The challenge now is thus more how to build an effective narrative in a world of billions of wannabe celebrities. The first rule is simplicity. ... In 2000, the average attention span of an internet user was measured at twelve seconds. By 2015, it had shrunk to eight seconds -- slightly less than the average attention span of a goldfish. An effective digital narrative, therefore, is one that can be absorbed almost instantly.
  • Page 159 This explains why so many modern narratives exist at least partially in images. Pictures are not just worth the proverbial thousand words; they deliver the point quickly. ... The second rule of narrative is resonance. Nearly all effective narratives conform to what social scientists call "frames," products of particular language and culture that feel instantly and deeply familiar. ... A resonant narrative is one that fits neatly into our preexisting story lines by allowing us to see ourselves clearly in solidarity with -- or opposition to -- its actors.
  • Page 160 According to a study by the Pew Research Center, the more unyieldingly hyperpartisan a member of Congress is -- best fitting our concept of the characters in a partisan play -- the more Twitter followers he or she draws. ... The third and final rule of narrative is novelty. Just as narrative frames help build resonance, they also serve to make things predictable. Too much predictability, though, can be boring, especially in an age of microscopic attention spans and unlimited entertainment. The most effective storytellers tweak, subvert, or "break" a frame, playing with an audience's expectations to command new levels of attention.
  • Page 160 These three traits -- simplicity, resonance, and novelty -- determine which narratives stick and which fall flat. It's no coincidence that everyone from far-right political leaders to women's rights activists to the Kardashian clan speaks constantly of "controlling the narrative."
  • Page 161 "When we do not know, or when we do not know enough, we tend always to substitute emotions for thoughts." ... What captures the most attention on social media isn't content that makes a profound argument or expands viewers' intellectual horizons. Instead, it is content that stirs emotions. ... Or, in simpler terms, content that can be labeled "LOL,""OMG," or "WTF."
  • Page 162 [In] 2013, Chinese data scientists conducted an exhaustive study of conversations on the social media platform Weibo. Analyzing 70 million messages spread across 200,000 users, they discovered that anger was the emotion that traveled fastest and farthest through the social network -- and the competition wasn't even close. "Anger is more influential than other emotions like joy," the researchers bluntly concluded. ... "Emotional contagion occurs without direct interaction between people," the scientists concluded, "and in the complete absence of nonverbal cues." Just seeing repeated messages of joy or outrage was enough to make people feel those emotions themselves. ... When an issue has two sides -- as it almost always does -- it can resemble a perpetual-motion machine of outrage.
  • Page 163 Although the word "troll" conjures images of beasts lurking under bridges and dates back to Scandinavian folklore, its modern internet use actually has its roots in the Vietnam War.
  • Page 164 "Trolling for newbies" became a sport in which experienced users would post shamelessly provocative questions designed to spark the ire of new (and unwitting) users. ... Today, we know trolls as those internet users who post messages that are less about sharing information than spreading anger. Their specific goal is to provoke a furious response. ... 1946 by the French philosopher Jean-Paul Sartre in describing the tactics of anti-Semites: They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words . . . They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert.
  • Page 165 just like conspiracy theories, the more the anger spreads, the more internet users are made susceptible to it. ... "Our findings suggest that trolling . . . can be contagious . . ." There's no doubt that trolling makes the internet a worse place. Trolling targets livelihoods and ruins lives. ... But the worst online trolling doesn't necessarily stay online. ... trolling too often ends in real-life violence and tragedy. Or it can yield political power.
  • Page 167 Achieving a sense of authenticity has become an important milestone for any online operation. In bland corporate jargon, this is called "brand engagement" -- extending an organization's reach by building a facsimile of a relationship between an impersonal brand and its followers.
  • Page 169 The term "community" connotes a group with shared interests and identities that, importantly, make them distinct from the wider world. In the past, a community resided in a specific location. Now it can be created online, including (and perhaps especially) among those who find a common sense of fellowship in the worst kinds of shared identities that exclude others.
  • Page 170 As these extremists have banded together, they have carved out online spaces where they are encouraged and empowered to "be themselves." They have found warmth and joy in each other's company, even as they advocate for the forced deportation of those whose skin color or religion is different from their own.
  • Page 173 Kevin Madden
  • Page 173 "Trump understands one important dynamic: In a world where there is a wealth of information, there is always a poverty of attention, and he has this ability to generate four or five story lines a day . . . He is always in control."
  • Page 174 "Poe's Law." This is an internet adage that emerged from troll-infested arguments on the website Christian Forums. The law states, "Without a winking smiley or other blatant display of humor, it is utterly impossible to parody a [fundamentalist] in such a way that someone won't mistake it for the genuine article." ... [In] other words, there is a point at which the most sincere profession of faith becomes indistinguishable from a parody; where a simple, stupid statement might actually be considered an act of profound meta-irony. ... From the beginning, many of these lifelong trolls found something to admire in Trump. Part ... most of all, they liked Trump because, in the fast-talking, foulmouthed, combative billionaire, they saw someone just like them -- a troll.
  • Page 175 The collective efforts of Trump's troll army helped steer the online trends that shaped the election. They dredged up old controversies, spun wild conspiracy theories that Trump's opponents had to waste valuable political capital fighting off, and ensured that the most impactful attacks continued to fester and never left public attention. Although neither presidential candidate ... was well liked, an analysis, by the firm Brandwatch, of tens of millions of election-related tweets showed a near-constant decline in the number of messages that spoke positively about Clinton.
  • Page 177 A UK-based firm that Breitbart chairman and Trump campaign CEO Steve Bannon had helped form in 2013, Cambridge Analytica had previously been active in conducting information warfare–style efforts on behalf of clients ranging from corporations to the "Leave" side of Brexit. ... The researchers had concluded that it took only ten "likes" to know more about someone than a work colleague knew and just seventy to know more than their real-world friends.
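A toy illustration of the underlying technique (invented data and a generic classifier, not the researchers' or Cambridge Analytica's actual models): represent each user as a row of binary page "likes" and fit a standard model to predict a trait from that footprint.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented data: 400 users, 70 pages they did (1) or didn't (0) "like", and a
# binary trait that is secretly correlated with their like patterns.
rng = np.random.default_rng(0)
n_users, n_pages = 400, 70

likes = rng.integers(0, 2, size=(n_users, n_pages))
weights = rng.normal(0, 1, n_pages)                       # hidden page-to-trait link
trait = (((likes - 0.5) @ weights + rng.normal(0, 1, n_users)) > 0).astype(int)

# A plain classifier learns the trait from nothing but the like footprint.
train, test = slice(0, 300), slice(300, None)
model = LogisticRegression(max_iter=2000).fit(likes[train], trait[train])
print(f"held-out accuracy from {n_pages} likes per user: "
      f"{model.score(likes[test], trait[test]):.2f}")
```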
  • Page 178 Importantly, the wealth of data didn't just allow a new kind of micro-targeting of voters, with exactly the message they cared most about, but it also provided new insights into how to tailor that message to influence them most.
  • Page 179 Importantly, BuzzFeed's model didn't depend on handcrafting any particular item to go viral; it depended on throwing out dozens of ideas at once and seeing what stuck. ... For every major viral success, like "12 Extremely Disappointing Facts About Popular Music," there were dozens of duds, like "Leonardo DiCaprio Might Be a Human Puppy." ... What mattered most was scale and experimentation, inundating an audience with potential choices and seeing what they picked. ... make many small bets,
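The "many small bets" approach can be sketched as a simple explore-and-exploit loop over candidate headlines. This epsilon-greedy toy is my own illustration of the logic, not BuzzFeed's actual tooling; the candidate headlines and their hidden click rates are invented.

```python
import random

# Float every candidate at once, keep a trickle of traffic exploring all of
# them, and shift most of the traffic toward whichever one is sticking.
headlines = {
    "12 Facts That Will Ruin Your Favorite Band": 0.09,   # hidden true click rate
    "A Sober Look at Music Industry Economics": 0.01,
    "This Puppy Thinks He Is a Person": 0.04,
}

clicks = {h: 0 for h in headlines}
shows = {h: 0 for h in headlines}

random.seed(1)
for _ in range(5000):                        # 5,000 simulated impressions
    if random.random() < 0.1:                # 10% of traffic keeps exploring
        choice = random.choice(list(headlines))
    else:                                    # the rest goes to the current leader
        choice = max(headlines, key=lambda h: clicks[h] / shows[h] if shows[h] else 0)
    shows[choice] += 1
    if random.random() < headlines[choice]:  # did this impression become a click?
        clicks[choice] += 1

for h in headlines:
    rate = clicks[h] / shows[h] if shows[h] else 0.0
    print(f"{shows[h]:5d} impressions, {rate:.3f} click rate  {h}")
```

Run long enough, most of the traffic typically drifts toward the headline with the highest underlying click rate, while the duds are quietly starved.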
  • Page 180 But the fact that these lessons are now available to anyone means that not all online battles will be one-sided blitzkriegs. As more and more users learn them, the results are vast online struggles that challenge our traditional understanding of war.

LikeWar
  • Page 182 Arquilla and Ronfeldt went further. They also predicted that cyberwar would be accompanied by something else: "netwar." They explained: It means trying to disrupt, damage, or modify what a target population "knows" or thinks it knows about itself and the world around it. A netwar may focus on public or elite opinion, or both. It may involve public diplomacy measures, propaganda and psychological campaigns, political and cultural subversion, deception of or interference with the local media . . . In other words, netwar represents a new entry on the spectrum of conflict that spans economic, political, and social as well as military forms of "war."
  • Page 183 Like most theories about the early internet, however, the rhetoric ran far ahead of what was happening in the real world. ... Instead, early netwar became the province of far-left activists and democratic protesters, beginning with the 1994 Zapatista uprising in Mexico and culminating in the 2011 Arab Spring. ... In time, terrorists and far-right extremists also began to gravitate toward netwar tactics. ... "Our hope was at the least, there would be a balance between the two," Arquilla told us, with the benefit of twenty years' reflection. "I think what we've seen is a greater prevalence of the darker side of Janus. I'm troubled to see that."
  • Page 184 Today, online battles are no longer just the stuff of science fiction or wonky think tank reports, but an integral part of global conflict. As a result, governments around the world have begun to adapt to it.
  • Page 185 But there's a second revolution at work -- even stranger and more pressing than the one foreseen by Arquilla and Ronfeldt. As national militaries have reoriented themselves to fight global information conflicts, the domestic politics of these countries have also morphed to resemble netwars. And the two spheres have become linked. Just as rival states and conflict actors use ... the internet to manipulate and deceive, so, too, do political candidates and activists of all stripes.
  • Page 186 The realms of war and politics have begun to merge.
  • Page 186 Victory requires an appreciation of the nature of virality and the whimsical ways of the attention economy, as well as a talent for conveying narrative, emotion, and authenticity, melded with community-building and a ceaseless supply of content (inundation).
  • Page 187 Pepe was the product of an evolutionary cycle that moved at digital warp speed on the internet, piling meaning atop meaning until everyone lost track. Pepe was also the product of a conflict of reinvention and appropriation that twisted him in directions that no one might have expected. In understanding Pepe, one can understand memes, and through them the life cycle of ideas on the internet. ... Pepe became the ideal online phenomenon -- popular and endlessly adaptable, while remaining too weird and unattractive to ever go fully mainstream. ... What began as a mockery of political activism soon became, for many of these users, a serious effort to help Trump win. At the same time, clusters of traditional Trump supporters began to adopt the same mannerisms and tactics as actual trolls. As a result, Pepe underwent another transformation. The meme was still dumb and irreverent, but now he was suffused with political meaning as well.
  • Page 188 Pepe had entered the real world, with real consequences. ... When Trump won, Pepe transformed again. The green frog became representative of a successful, hard-fought campaign -- one that now controlled all the levers of government.
  • Page 189 Had Pepe really been racist? The answer is yes. Had Pepe been an innocent, silly joke? Also, yes. In truth, Pepe was a prism, a symbol continually reinterpreted and repurposed by internet pranksters, Trump supporters, liberal activists, ultranationalists, and everyone who just happened to glimpse him in passing. Pepe was a "meme," an empty vessel, like the chromatin that shields DNA; a protective layer over a rich, ever-multiplying strand of ideas. ... just as biological life had to ceaselessly copy itself in order to survive, ideas had to do so, too. In his 1976 book The Selfish Gene, evolutionary biologist Richard Dawkins put a name to these bits of organic, self- multiplying information: "memes."
  • Page 191 A perfect illustration came when the Center for Naval Analyses, a U.S. military–funded think tank, released a report titled "Exploring the Utility of Memes for U.S. Government Influence Campaigns." Naturally, its cover was a meme of Toy Story's Buzz Lightyear.
  • Page 192 the power of virality -- the need to produce and propel viral content through the online system. ... the content that goes viral -- the meme -- can be quite easily hijacked. And whoever does that best determines what reality looks like:
  • Page 202 armed confrontations have become inextricably linked to internet trolling. ... It is about persuading someone to back off before the first punch is thrown. ... Failing that, it's about weakening and embarrassing them, sapping their supporters while energizing your own.
  • Page 206 [The strategy is always] to dismiss, distort, distract, dismay, and divide.
  • Page 208 [There is a] new, potent kind of information conflict that has infested war and politics alike. Instead, it is merely emblematic of larger truths in the social media age. Bots, trolls, and sockpuppets can invent new "facts" out of thin air; homophily and confirmation bias ensure that at least a few people will believe them. On its own, this is grim enough, leading to a polarized society and a culture of mistrust. But clever groups and governments can twist this phenomenon to their own ends, using virality and perception to drag their goals closer within reach.
  • Page 208 Call it disinformation or simple psychological manipulation. The result is the same, summarized best by the tagline of the notorious conspiracy website Infowars: "There's a war on . . . for your mind!" ... These offensives abide by two basic principles. The first is believability. Engineered falsehoods work best when they carry what seems like a grain of truth. They play on existing prejudices, seeking to add just one more layer to a narrative that already exists in the target's mind.
  • Page 209 social media's very form lowers the credibility threshold even more: whatever the news, if it comes from friends and family, it is inherently more believable. ... The second principle of these stealthy information campaigns is extension. ... The most devastating falsehoods are those that extend across vast numbers of people as well as across time. They spread by how they linger, formed in such a way that the very act of denial breathes new life into the headline, helping it burrow deeper into the collective consciousness.
  • Page 210 tens of thousands of Twitter users all screaming at the same nonexistent adversary.
  • Page 211 And in this sort of war, Western democracies find themselves at a distinct disadvantage. Shaped by the Enlightenment, they seek to be logical and consistent. Built upon notions of transparency, they seek to be accountable and responsible. These are the qualities that made them so successful, the form of government that won both world wars and the contest of superpowers in the last century. Unfortunately, they are not the values of a good troll,
  • Page 214 As modern warfare turns increasingly on the power of internet operations, it renders everyone a potential online combatant.
  • Page 214 As two or more online adversaries fight over the fate of that content -- your individual choice whether to amplify, expand, suppress, or distort it -- even a single "like" or retweet becomes a meaningful action in an ever-evolving information war.
  • Page 217 The door is being slowly opened to a bizarre but not impossible future where the world's great powers might fall to bloodshed due -- in part -- to matters getting out of hand online.

Masters of the Universe
  • Page 219 Yes, the website that would become the video archive of the human race was launched by an errant nip-slip. Yet the strangest part of the story wasn't how unusual it was, but rather how typical. ... the DNA of the social media ecosystem: nearly universally male, white, drawn from America's upper middle class, and dedicated, at least initially, to attacking narrow problems with equally narrow solutions. Despite their modest, usually geeky origins, these founders now rule digital empires that dictate what happens in politics, war, and society at large. It has been an uneasy reign as they come to grips with what it means to rule their kingdoms.
  • Page 222 This "engineering first" mentality applies to both problems and potential solutions. Whenever these companies have had to reckon with a political, social, or ethical dilemma -- ironically spawned by their platforms' very success -- they often grasp for another new technology to solve it. As a senior executive at one of these companies put it to us, "If we could use code to solve all the world's problems, we would."
  • Page 223 Should these companies restrict the information that passes through their servers? What should they restrict? And -- most important for the future of both social media and the world -- how should they do it?
  • Page 225 Reno v. American Civil Liberties Union (1997) was the first and most important Supreme Court case to involve the internet. In a unanimous decision, the justices basically laughed the CDA out the door, noting that it massively violated the First Amendment. The only part of the CDA that survived was Section 230. Over the ensuing years, it would be consistently challenged and upheld. With each successful defense, its legal standing grew stronger. Outside of two specific exemptions (federal criminal law and intellectual property), the internet was mostly left to govern itself. As a result, most early corporate censorship -- more politely known as "content moderation" -- would come not because of government mandate, but to avoid involving government in the first place.
  • Page 231 In 2012, both Blogger (originally marketed as "Push-Button Publishing for the People") and Twitter ("the free speech wing of the free speech party") quietly introduced features that allowed governments to submit censorship requests on a per-country basis.
  • Page 232 a handful of Silicon Valley engineers were trying to codify and enforce a single set of standards for every nation in the world, all in an attempt to avoid scandal and controversy. As any political scientist could have told them, this effort was doomed to fail.
  • Page 235 It was Twitter, not YouTube, that became terrorists' main social media haven. In a horrifying irony, terrorists who wanted to destroy freedom of speech found perfect alignment with Twitter's original commitment to freedom of speech.
  • Page 242 With each step the social media giants took as they waded deeper into political issues -- tackling terrorism, extremism, and misinformation -- they found themselves ever more bogged down by scandals that arose from the "gray areas" of politics and war. Sometimes a new initiative to solve one problem might be exploited by a predatory government (Russia had a very different definition of "terrorism" than the United States); other times, a well-meaning reporting system might be gamed by trolls.
  • Page 243 As Upton Sinclair put it a century earlier: "It is difficult to get a man to understand something when his salary depends on his not understanding it."
  • Page 244 The AOL Community Leader Program was born. In exchange for free or reduced-price internet access, volunteers agreed to labor for dozens of hours each week to maintain the web communities that made AOL rich, ensuring that they stayed on topic and that porn was kept to a minimum. Given special screen names, or "uniforms," that filled them with civic pride, they could mute or kick out disruptive users.
  • Page 245 In 2005, AOL terminated the Community Leader Program, bestowing a free twelve-month subscription on any remaining volunteers. ... The rise and fall of AOL's digital serfs foreshadowed how all big internet companies would come to handle content moderation. ... But as companies begrudgingly accepted more and more content moderation responsibility, the job still needed to get done. Their solution was to split the chore into two parts. The first part was crowdsourced to users (not just volunteers but everyone), who were invited to flag content they didn't like and prompted to explain why. The second part was outsourced to full-time content moderators, usually contractors based overseas, who could wade through as many as a thousand graphic images and videos each day.
  • Page 246 And then there are the people who sit at the other end of the pipeline, tech laborers who must squint their way through each beheading video, graphic car crash, or scared toddler in a dark room whose suffering has not yet been chronicled and added to Microsoft's horrifying child abuse database. There are an estimated 150,000 workers in these jobs around the world, most of them subcontractors scattered across India and the Philippines. ... Professional trolls try to make the internet worse. Professional content moderators try to make it a little better.
  • Page 248 Neural networks are a new kind of computing system: a calculating machine that hardly resembles a "machine" at all. Although such networks were theorized as far back as the 1940s, they've only matured during this decade as cloud processing has begun to make them practical. Instead of rule-based programming that relies on formal logic ("If A = yes, run process B; if A = no, run process C"), neural networks resemble living brains. They're composed of millions of artificial neurons, each of which draws connections to thousands of other neurons via "synapses." Each neuron has its own level of intensity, determined either by the initial input or by synaptic connections received from neurons farther up the stream. In turn, this determines the strength of the signal these neurons send down the stream through their own dependent synapses.
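To make the contrast concrete, here is a minimal sketch (not from the book) of a single artificial neuron in plain Python: where rule-based code hard-wires an if/else decision, the neuron's output is just a weighted sum of its inputs (its "synapses") passed through an activation function, and training is the process of adjusting those weights.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of incoming signals ("synapses"),
    squashed by a sigmoid activation into an intensity between 0 and 1."""
    signal = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-signal))

# Rule-based programming hard-codes the decision in advance:
def rule_based(a):
    return "run process B" if a else "run process C"

# A neuron's "decision" lives in its weights instead. These example values
# are arbitrary; training is what would adjust them.
print(rule_based(True))
print(neuron(inputs=[1.0, 0.0, 0.5], weights=[0.8, -1.2, 0.3], bias=-0.1))
```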
  • Page 249 These networks function by means of pattern recognition.
  • Page 249 With enough neurons, it becomes possible to split the network into multiple "layers," each discovering a new pattern by starting with the findings of the previous layer. ... Each layer allows the network to approach a problem with more and more granularity. But each layer also demands exponentially more neurons and computing power. ... In 2012, engineers with the Google Brain project published a groundbreaking study that documented how they had fed a nine-layer neural network 10 million different screenshots from random YouTube videos, leaving it to play with the data on its own. As it sifted through the screenshots, the neural network -- just like many human YouTube users -- developed a fascination with pictures of cats. ... isolating a set of cat-related qualities, it taught itself to be an effective cat detector. "We never told it during the training, ‘This is a cat,'" explained one of the Google engineers. "It basically invented the concept of a cat." ... The machine simply distinguished the pattern of a cat from all "not-cat" patterns.
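The layering described above maps directly onto how such networks are typically written in code. The sketch below is a hypothetical illustration in PyTorch (not the Google Brain setup, which learned without labels): each layer's output becomes the next layer's input, so later layers work with the patterns discovered by earlier ones, ending in a single "cat-likeness" score.

```python
import torch
from torch import nn

# A tiny layered "cat / not-cat" classifier, purely illustrative.
# Real image networks are far deeper and train on millions of examples.
model = nn.Sequential(
    nn.Flatten(),                         # turn a 64x64 grayscale image into 4096 numbers
    nn.Linear(64 * 64, 256), nn.ReLU(),   # layer 1: low-level patterns (edges, blobs)
    nn.Linear(256, 64), nn.ReLU(),        # layer 2: combinations of those patterns
    nn.Linear(64, 1), nn.Sigmoid(),       # layer 3: one score, "how cat-like is this?"
)

fake_image = torch.rand(1, 1, 64, 64)     # stand-in for a video screenshot
score = model(fake_image)                 # untrained, so this score is meaningless
print(f"cat-likeness: {score.item():.2f}")
```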
  • Page 250 Feed the network enough voice audio recordings, and it will learn to recognize speech. Feed it the traffic density of a city, and it will tell you where to put the traffic lights. Feed it 100 million Facebook likes and purchase histories, and it will predict, quite accurately, what any one person might want to buy or even whom they might vote for. ... In late 2017, Google announced that 80 percent of the violent extremist videos uploaded to YouTube had been automatically spotted and removed before a single user had flagged them.
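How an automated flag might translate into action can be sketched as a simple thresholding step. The pipeline below is entirely hypothetical (the classifier is a placeholder and the thresholds are invented); it is meant only to illustrate how a model's confidence score could trigger removal before any user report, as in the YouTube figure above.

```python
# Hypothetical moderation pipeline; names and thresholds are invented for illustration.
REMOVE_THRESHOLD = 0.95   # auto-remove only when the model is very confident
REVIEW_THRESHOLD = 0.60   # hand borderline uploads to human moderators

def score_extremist_content(video: dict) -> float:
    """Placeholder for a trained classifier's confidence that the upload violates policy."""
    return video.get("model_score", 0.0)

def moderate(video: dict) -> str:
    score = score_extremist_content(video)
    if score >= REMOVE_THRESHOLD:
        return "removed automatically, before any user flag"
    if score >= REVIEW_THRESHOLD:
        return "queued for human content moderators"
    return "published"

print(moderate({"id": "upload-1", "model_score": 0.97}))
print(moderate({"id": "upload-2", "model_score": 0.40}))
```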
  • Page 251 As we saw earlier, bots pose as humans online, pushing out rote messages. Their more advanced versions, chatbots, are algorithms designed to convey the appearance of human intelligence by parroting scripts from a vast database.
  • Page 252 No matter how convincing it is, though, each chatbot is basically reciting lines from a very, very long script. ... By contrast, neural network–trained chatbots -- also known as machine-driven communications tools, or MADCOMs -- have no script at all, just the speech patterns deciphered by studying millions or billions of conversations. Instead of contemplating how MADCOMs might be used, it's easier to ask what one might not accomplish with intelligent, adaptive algorithms that mirror human speech patterns. ... In 2016, Microsoft launched Tay, a neural network–powered chatbot that adopted the speech patterns of a teenage girl. Anyone could speak to Tay and contribute to her dataset; she was also given a Twitter account. Trolls swarmed Tay immediately, and she was as happy to learn from them as from anyone else. Tay's bubbly personality soon veered into racism, sexism, and Holocaust denial. "RACE WAR NOW," she tweeted, later adding, "Bush did 9/11." After less than a day, Tay was unceremoniously put to sleep, her fevered artificial brain left to dream of electric frogs. ... Nobody, their creators included, can fully comprehend how they work. ... When there's no way to know if the network is wrong -- if it's making a prediction of the future based on past data -- users can either ignore it or take its prognostication at face value. ... The greatest danger of neural networks, therefore, lies in their sheer versatility. Smart though the technology may be, it cares not how it's used. These networks are no different from a knife or a gun or a bomb -- indeed, they're as double-edged as the internet itself.
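The script-versus-no-script distinction is easy to see in miniature. In the sketch below (an illustration, not Microsoft's code), the scripted bot can only return replies written in advance, while the "learned" generator, here a toy Markov chain standing in for the neural language models behind MADCOMs and Tay, produces replies from whatever conversation data it has absorbed, which is exactly why feeding it poisoned data poisons its output.

```python
import random

# 1) Scripted chatbot: every possible reply is written in advance.
SCRIPT = {
    "hello": "Hi there! How can I help?",
    "bye": "Goodbye!",
}
def scripted_reply(message):
    return SCRIPT.get(message.lower().strip(), "Sorry, I don't understand.")

# 2) "Learned" chatbot: no script, just patterns absorbed from training text.
#    A Markov chain is a crude stand-in for a neural language model; the point
#    is that it echoes whatever it is trained on -- Tay's weakness.
def train(corpus):
    chain = {}
    words = corpus.split()
    for a, b in zip(words, words[1:]):
        chain.setdefault(a, []).append(b)
    return chain

def generate(chain, start, length=8):
    word, out = start, [start]
    for _ in range(length):
        word = random.choice(chain.get(word, [start]))
        out.append(word)
    return " ".join(out)

chain = train("the internet is a battlefield and the internet never forgets")
print(scripted_reply("hello"))
print(generate(chain, "the"))
```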
  • Page 253 Anyone can build and train one using free, open-source tools. An explosion of interest in these systems has led to thousands of new applications. ... the network can use its mastery of a voice to approximate words and phrases that it's never heard. ... With a minute's worth of audio, these systems might make a good approximation of someone's speech patterns. With a few hours, they are essentially perfect.
  • Page 254 Neural networks can synthesize not just what we read and hear but also what we see. ... Neural networks can also be used to create deep fakes that aren't copies at all. Rather than just study images to learn the names of different objects, these networks can learn how to produce new, never-before-seen versions of the objects in question. They are called "generative networks." ... Using such technology, users will eventually be able to conjure a convincing likeness of any scene or person they or the AI can imagine. Because the image will be truly original, it will be impossible to identify the forgery via many of the old methods of detection. ... And finally, there are the MADCOMs. The inherent promise of such technology -- an AI that is essentially indistinguishable from a human operator -- also sets it up for terrible misuse.
  • Page 255 Today, it remains possible for a savvy internet user to distinguish "real" people from automated botnets and even many sockpuppets (the Russified English helped us spot a few). Soon enough, even this uncertain state of affairs may be recalled fondly as the "good old days" ... Give a Twitter botnet to a MADCOM and the network might be able to distort the algorithmic prominence of a topic without anyone noticing, simply by creating realistic conversations among its many fake component selves. MADCOMs won't just drive news cycles, but will also trick and manipulate the people reacting to them. ... Matthew Chessen, a senior technology policy advisor at the U.S. State Department, doesn't mince words about the inevitable MADCOM ascendancy. It will "determine the fate of the internet, our society, and our democracy," he writes. No longer will humans be reliably in charge of the machines.
  • Page 256 Instead, as machines steer our ideas and culture in an automated, evolutionary process that we no longer understand, they will "start programming us." ... The LikeWars of tomorrow will be fought by highly intelligent, inscrutable algorithms that will speak convincingly of things that never happened, producing "proof" that doesn't really exist. They'll seed falsehoods across the social media landscape with an intensity and volume that will make the current state of affairs look quaint. ... Recent breakthroughs in neural network training hint at what will drive machine evolution to the next level, but also save us from algorithms that seek to manipulate us: an AI survival of the fittest. ... of "generative adversarial networks." ... The first network strains to create something that seems real -- an image, a video, a human conversation -- while the second network struggles to determine if it's fake.
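That two-network contest can be written as a short training loop. The sketch below is a generic toy example in PyTorch (the "real" data is just numbers drawn from a normal distribution, not images or video): a forger network learns to produce samples that a detector network scores as real, while the detector simultaneously learns to tell the two apart.

```python
import torch
from torch import nn

# Toy generative adversarial setup: the "real" data is a simple number distribution.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))                 # forger
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())   # detector
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    real = torch.randn(32, 1) * 2 + 5      # samples from the "true" distribution
    noise = torch.randn(32, 8)
    fake = G(noise)                        # the forger's current attempts

    # Detector: label real samples 1 and forgeries 0, and get better at telling them apart.
    d_loss = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad()
    d_loss.backward()
    opt_d.step()

    # Forger: adjust so the detector scores its forgeries as real (label 1).
    g_loss = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

print("forged samples:", G(torch.randn(3, 8)).detach().flatten())
```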
  • Page 257 Although this process teaches networks to produce increasingly accurate forgeries, it also leaves open the potential for networks to get better and better at detecting fakes.

Conclusion
  • Page 260 This duality of the social media revolution touches the rest of us, too. The evolutionary advantages that make us such dynamic, social creatures -- our curiosity, affinity for others, and desire to belong -- also render us susceptible to dangerous currents of disinformation.
  • Page 261 Regardless of how old they are, humans as a species are uniquely ill-equipped to handle both the instantaneity and the immensity of information that defines the social media age.
  • Page 261 First, for all the sense of flux, the modern information environment is becoming stable. ... Second, the internet is a battlefield. Like every other technology before it, the internet is not a harbinger of peace and understanding. Instead, it's a platform for achieving the goals of whichever actor manipulates it most effectively. Its weaponization, and the conflicts that then erupt on it, define both what happens on the internet and what we take away from ... Third, this battlefield changes how we must think about information itself. ... If something happens, we must assume that there's likely a digital record of it -- an image, video, or errant tweet -- that will surface seconds or years from now. However, an event only carries power if people also believe that it happened. ... process means that a manufactured event can have real power, while a demonstrably true event can be rendered irrelevant.
  • Page 262 Everything is now transparent, yet the truth can be easily obscured. ... Fifth, we're all part of the battle. ... For governments, the first and most important step is to take this new battleground seriously. ... authoritarian leaders have long since attuned themselves to the potential of social media, both as a threat to their rule and as a new vector for attacking their foes.
  • Page 264 Accordingly, information literacy is no longer merely an education issue but a national security imperative.
  • Page 265 Instead, part of the governance solution to our social media problem may actually be more social media, just of a different kind. ... submission of and digital voting on key issues, moving the power from the politician back to the people. ... What is common across these examples of governance via network is the use of social media to learn and involve.
  • Page 266 When someone engages in the spread of lies, hate, and other societal poisons, they should be stigmatized accordingly. ... Stopping these bad actors requires setting an example and ensuring that repeat offenders never escape the gravity of their past actions and are excluded from the institutions and platforms of power that now matter most in our society. In a democracy, you have a right to your opinion, but no right to be celebrated for an ugly, hateful opinion, especially if you've spread lie after lie. ... We must also come to grips with the new challenge of free speech in the age of social media -- what is known as "dangerous speech."
  • Page 267 It is a strange fact that the entities best positioned to police the viral spread of hate and violence are not legislatures, but social media companies.
  • Page 267 Silicon Valley must accept more of the political and social responsibility that the success of its technology has thrust upon it.
  • Page 268 Accordingly, these companies must abandon the pretense that they are merely "neutral" platform providers.
  • Page 268 In the process, Silicon Valley must also break the code of silence that pervades its own culture.
  • Page 269 effective information literacy education works by presenting the people being targeted with specific, proven instances of misinformation, encouraging them to understand how and why it worked against them.
  • Page 271 Instead, if we want to stop being manipulated, we must change how we navigate the new media environment.
  • Page 273 Social media is extraordinarily powerful, but also easily accessible and pliable. Across it play out battles for not just every issue you care about, but for the future itself. Yet within this network, and in each of the conflicts on it, we all still have the power of choice.