ENGL 8122  ※  User-Experience Research & Writing



Filterworld: How Algorithms Flattened Culture

  • Page 1 In 1769, a civil servant in the Habsburg Empire named Johann Wolfgang Ritter von Kempelen built a device nicknamed "the Mechanical Turk."
  • Page 2 Over the two centuries since its invention, the device has become a prevalent metaphor for technological manipulation. It represents the human lurking behind the facade of seemingly advanced technology as well as the ability of such devices to deceive us about the way they work.
  • Page 3 Algorithm is usually shorthand for "algorithmic recommendations," the digital mechanisms that absorb piles of user data, push it through a set of equations, and spit out a result deemed most relevant to preset goals.
  • Page 3 Algorithmic recommendations shape the vast majority of our experiences in digital spaces by considering our previous actions and selecting the pieces of content that will most suit our patterns of behavior. They are supposed to interpret and then show us what we want to see.
  • Page 4 All of these small decisions used to be made one at a time by humans: A newspaper editor decided which stories to put on the front page,
  • Page 4 Algorithmic recommendations are the latest iteration of the Mechanical Turk: a series of human decisions that have been dressed up and automated as technological ones, at an inhuman scale and speed.
  • Page 4 The algorithm always wins.
  • Page 4 Though Filterworld has also changed politics, education, and interpersonal relationships, among many other facets of society, my focus is on culture.
  • Page 4 guiding our attention
  • Page 5 Each platform develops its own stylistic archetype, which is informed not just by aesthetic preferences but by biases of race, gender, and politics as well as by the fundamental business model of the corporation that owns it.
  • Page 6 "harmonization of tastes." Through algorithmic digital platforms like Instagram, Yelp, and Foursquare, more people around the world are learning to enjoy and seek out similar products and experiences in their physical lives.
  • Page 7 "Surveillance capitalism," as the scholar Shoshana Zuboff has labeled it, is how tech companies monetize the constant absorption of our personal data, an intensification of the attention economy.
  • Page 7 We consume what the feeds recommend to us without engaging too deeply with the material.
  • Page 7 Our natural reaction is to seek out culture that embraces nothingness, that blankets and soothes rather than challenges or surprises, as powerful artwork is meant to do. Our capacity to be moved, or even to be interested and curious, is depleted.
  • Page 9 In place of the human gatekeepers and curators of culture, the editors and DJs, we now have a set of algorithmic gatekeepers.
  • Page 9 Attention becomes the only metric by which culture is judged, and what gets attention is dictated by equations developed by Silicon Valley engineers.
  • Page 9 The outcome of such algorithmic gatekeeping is the pervasive flattening that has been happening across culture.
  • Page 9 Flatness is the lowest common denominator, an averageness that has never been the marker of humanity's proudest cultural creations.
  • Page 9 culture of Filterworld is the culture of presets, established patterns that get repeated again and again.
  • Page 10 we can determine ways to escape it and resolve the omnipresent atmosphere of anxiety and ennui that algorithmic feeds have produced. We can dispel their influence only by understanding them—by opening the cabinet of the Mechanical Turk to reveal the operator inside.

Chapter 1: The Rise of Algorithmic Recommendations
  • Page 11 Algorithm as a term simply describes an equation: any formula or set of rules that produces a desired result.
  • Page 16 we're discussing a technology with a history and legacy that has slowly formed over centuries, long before the Internet existed.
  • Page 20 An executive at the music cataloging and recommendation service Pandora once described the company's system to me as an "orchestra" of algorithms, complete with a "conductor" algorithm. Each algorithm used different strategies to come up with a recommendation, and then the conductor algorithm dictated which suggestions were used at a given moment. (The only output was the next song to play in a playlist.) Different moments called for different algorithmic recommendation techniques.
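The "orchestra" described here can be sketched as a set of strategies plus a conductor that picks among their suggestions. Everything below (the strategy names, the skip-count context, the confidence scores) is an invented illustration of the structure, not Pandora's actual system.

```python
# Hypothetical sketch of an "orchestra" of recommenders with a "conductor".
# Strategies and selection logic are illustrative, not Pandora's real system.

def by_similar_listeners(context):
    # Collaborative strategy: what did listeners like this one play next?
    return ("Song A", 0.7 if context["skips_in_row"] < 2 else 0.3)

def by_audio_features(context):
    # Content-based strategy: match musical traits of the last song played.
    return ("Song B", 0.6 if context["skips_in_row"] >= 2 else 0.4)

def conductor(context, orchestra):
    # The conductor picks whichever strategy is most confident right now;
    # the only output is the next song to play in the playlist.
    suggestions = [strategy(context) for strategy in orchestra]
    song, confidence = max(suggestions, key=lambda s: s[1])
    return song

next_song = conductor({"skips_in_row": 3}, [by_similar_listeners, by_audio_features])
print(next_song)  # "Song B": the listener keeps skipping, so the conductor switches strategies
```

Different moments call for different techniques: here a run of skips shifts confidence from the collaborative strategy to the content-based one.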
  • Page 21 Recommendation algorithms as a way of automatically processing and sorting information were put into practice in the 1990s.
  • Page 22 "We need technology to help us wade through all the information to find the items we really want and need, and to rid us of the things we do not want to be bothered with."
  • Page 23 Social information filtering bypasses those problems because it is instead driven by the actions of human users, who evaluate content on their own—using judgments both quantitative and qualitative.
  • Page 23 even described it as "unnervingly accurate." Ringo's innovation was how it acknowledged that the best recommendations, or the best indications of relevance, were likely to come from other humans rather than analysis of the content itself. It represented a scaling up of human taste.
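Ringo's approach, social information filtering, survives today as user-user collaborative filtering: predict how much someone will like an item from the judgments of similar people, with no analysis of the content itself. A minimal sketch, with invented ratings and a deliberately simple similarity measure (not Ringo's actual math):

```python
# Toy user-user collaborative filtering: recommendations come from other
# humans' judgments, never from the content itself. Data are invented.
ratings = {
    "ann": {"jazz_album": 5, "pop_album": 1, "folk_album": 4},
    "ben": {"jazz_album": 4, "pop_album": 2, "folk_album": 5},
    "cat": {"jazz_album": 1, "pop_album": 5},
}

def similarity(a, b):
    # Agreement over items both users rated (simple inverse rating distance).
    shared = set(ratings[a]) & set(ratings[b])
    if not shared:
        return 0.0
    diff = sum(abs(ratings[a][i] - ratings[b][i]) for i in shared) / len(shared)
    return 1.0 / (1.0 + diff)

def predict(user, item):
    # Weight other users' ratings of the item by their similarity to `user`.
    peers = [(similarity(user, other), ratings[other][item])
             for other in ratings if other != user and item in ratings[other]]
    total = sum(w for w, _ in peers)
    return sum(w * r for w, r in peers) / total if total else None

print(round(predict("cat", "folk_album"), 2))  # 4.56, a scaled-up version of other humans' taste
```

Note that no field in `ratings` describes the music itself; the prediction is purely "a scaling up of human taste."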
  • Page 24 PageRank worked by measuring how many times a website was linked to by other sites, similar to the way academic papers cite key pieces of past research.
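PageRank's link-counting can be sketched as power iteration over a link graph: a page's score is fed by the scores of the pages linking to it, much as a paper gains standing from the papers that cite it. The four sites are invented; the damping factor of 0.85 is the value commonly cited for the standard formulation.

```python
# Minimal power-iteration PageRank over an invented four-site link graph.
links = {
    "blog": ["wiki", "news"],
    "news": ["wiki"],
    "wiki": ["blog"],
    "fanpage": ["wiki"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # A page passes its rank, split evenly, to the pages it links
                # to, the way a paper passes credibility to works it cites.
                new[target] += damping * rank[page] / len(outgoing)
        rank = new
    return rank

scores = pagerank(links)
print(max(scores, key=scores.get))  # "wiki": the most linked-to page accumulates the highest rank
```

The ranks always sum to one here, so raising one site's visibility necessarily lowers another's, a zero-sum property worth keeping in mind when sorting knowledge becomes this powerful.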
  • Page 25 in the Internet era, sorting knowledge might be even more powerful. Information is now easy to find in abundance; making sense of it, knowing which information is useful, is much harder.
  • Page 26 Nick Seaver is a sociologist and a professor at Tufts University who studies recommender systems.
  • Page 27 "The algorithm is metonymic for companies as a whole," he told me. "The Facebook algorithm doesn't exist; Facebook exists. The algorithm is a way of talking about Facebook's decisions."
  • Page 31 algorithms can warp language itself as users attempt to either game them or evade detection.
  • Page 36 if people are using a platform, staying engaged and active, then it counts as successful—no matter what they are doing.
  • Page 36 it is difficult to think of creating a piece of culture that is separate from algorithmic feeds, because those feeds control how it will be exposed to billions of consumers in the international digital audience.
  • Page 36 Without the feeds, there is no audience—
  • Page 36 for a piece of culture to be commercially successful, it must already have traction on digital platforms.
  • Page 37 Under algorithmic feeds, the popular becomes more popular, and the obscure becomes even less visible.
  • Page 37 Success or failure is accelerated.
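That accelerating feedback loop is a rich-get-richer process, and a toy simulation shows how quickly it separates winners from the obscure: if each view is allotted in proportion to existing popularity, items that start identical end up wildly unequal. The feed model here is a deliberate simplification, not any platform's real logic.

```python
import random

# Toy feed: each new view goes to an item with probability proportional to
# its current view count, so small early leads compound (rich get richer).
random.seed(7)
views = [1] * 10              # ten items, each starting with a single view
for _ in range(10_000):
    winner = random.choices(range(10), weights=views)[0]
    views[winner] += 1        # being seen makes an item more likely to be seen again

print(sorted(views, reverse=True))  # a few items capture most views; the rest stay obscure
```

Every item begins with identical "quality"; the final skew is produced entirely by the recommendation rule, which is the sense in which success or failure is accelerated.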
  • Page 38 Given that these capricious systems control so many facets of our lives, from socializing with our friends to building audiences for our creative projects, is it any wonder that social media users feel paranoid? We're encouraged to overlook algorithmic processes, but their glitches remind us of their unearned authority.
  • Page 38 The ambiguity of algorithmic influence creates a feeling that has been labeled "algorithmic anxiety."
  • Page 38 Airbnb forces a "double negotiation" for the hosts, the researchers wrote, because they must determine what their guests are looking for in a listing as well as which variables the algorithms are prioritizing to promote their property more often.
  • Page 39 platforms like Airbnb have long promised flexible work and alternative ways of making or supplementing a living, but they also created a new form of labor in the need to stay up to date on changes in algorithmic priorities.
  • Page 40 Algorithmic anxiety is something of a contemporary plague. It induces an OCD-ish tendency in many users toward hyperawareness and the need to repeat the same rituals, because when these rituals "work," the effect is so compelling, resulting in both a psychological dopamine rush from receiving attention and a potential economic reward if your online presence is monetized. It undergirds so many of our behaviors online: selecting the right profile picture, curating an attractive grid of photos on an Instagram account, choosing the right keywords on a marketplace listing.
  • Page 40 Exploitation is disguised as an accidental glitch instead of an intentional corporate policy. In reality, a company like Facebook is wholly in control of their algorithmic systems, able to change them at will—or turn them off.

Chapter 2: The Disruption of Personal Taste
  • Page 45 It was as if you could buy only the books that appeared on the New York Times bestseller list, but the list was operated by an untrustworthy company, one solely devoted to treating books as fungible objects to be offloaded as quickly as possible.
  • Page 46 A 2017 joke by Chet Haase pinpoints the problem: "A machine learning algorithm walks into a bar. The bartender asks, ‘What'll you have?' The algorithm says, ‘What's everyone else having?'"
  • Page 48 Taste is a word for how we measure culture and judge our relationship to it. If something suits our taste, we feel close to it and identify with it, as well as form relationships with other people based on it, the way customers commune over clothing labels (either loving or hating a particular brand).
  • Page 50 If taste indeed must be deeply felt, requires time to engage with, and benefits from the surprise that comes from the unfamiliar, then it seems that technology could not possibly replicate it, because algorithmic feeds run counter to these fundamental qualities.
  • Page 51 The feed structure also discourages users from spending too much time with any one piece of content.
  • Page 51 Korean philosopher Byung-Chul Han argued in his 2017 book In the Swarm, the sheer exposure of so many people to each other online without barriers—the "demediatization" of the Internet—makes "language and culture flatten out and become vulgar."
  • Page 51 Today we have more cultural options available to us than ever and they are accessible on demand. We are free to choose anything. Yet the choice we often make is to not have a choice, to have our purview shaped by automated feeds, which may be based on the aggregate actions of humans but are not human in themselves.
  • Page 51 Taste can also feel more like a cause for concern than a source of personal fulfillment. A selection made based on your own personal taste might be embarrassing if it unwittingly clashes with the norms of the situation at hand, like wearing athleisure to the office or bright colors to a somber funeral.
  • Page 52 Over the twentieth century, taste became less a philosophical concept concerning the quality of art than a parallel to industrial-era consumerism, a way to judge what to buy and judge others for what they buy in turn.
  • Page 53 Consumption without taste is just undiluted, accelerated capitalism.
  • Page 53 There are two forces forming our tastes. As I described previously, the first is our independent pursuit of what we individually enjoy, while the second is our awareness of what it appears that most other people like, the dominant mainstream.
  • Page 53 Pierre Bourdieu wrote in his 1984 book Distinction: A Social Critique of the Judgement of Taste. These choices can be symbolic of a range of things beyond just our aesthetic preferences, such as economic class, political ideology, and social identity. "Taste classifies, and it classifies the classifier," Bourdieu wrote. No wonder that we worry about what to like, and sometimes find it simpler to export that responsibility to machines.
  • Page 55 Online, users are often insulated from views and cultures that clash with their own. The overall digital environment is dictated by tech companies with ruthlessly capitalist, expansionary motives, which do not provide the most fertile ground for culture.
  • Page 56 Part of its appeal lies in breaking with the social code: wearing something unexpected or strange, even at times challenging your own taste.
  • Page 56 On the consumer side, the bombardment of recommendations can induce a kind of hypnosis that makes listening to, watching, or buying a product all but inevitable—whether it truly aligns with your taste or not.
  • Page 58 Your engagement is tracked by digital surveillance, and then you are served ads for products that match what you engage with, from brands that pay for your attention.
  • Page 62 Fascism means being forced to conform to the tenets of a single ideological view of the world, one that may utterly discount a particular identity or demographic. It is the mandate of homogeneity. Filterworld can be fascistic,
  • Page 62 With modern-day algorithmic recommendations, artists have much less choice in what becomes popular and even less control over the context that their work appears in.
  • Page 66 The Netflix algorithm factors in a user's viewing history and ratings; the actions of other users with similar preferences; and information about the content itself, like genre, actors, and release dates. It also includes the time of day the user is watching, what device they're watching on, and how long they tend to watch in that context.
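Those factors can be pictured as features feeding a single relevance score per title. The weights, feature names, and context bonus below are invented for illustration; Netflix's actual model is far more elaborate and not public.

```python
# Illustrative relevance scoring combining the kinds of signals described:
# taste history, collaborative popularity, and viewing context. All invented.
def relevance(title, user, context):
    score = 0.0
    score += 0.5 * user["genre_affinity"].get(title["genre"], 0.0)  # viewing history
    score += 0.3 * title["popularity_among_similar_users"]          # similar users' actions
    # Contextual signal: on a phone, shorter titles get a boost.
    if context["device"] == "phone" and title["runtime_minutes"] <= 30:
        score += 0.2
    return score

user = {"genre_affinity": {"thriller": 0.9, "sitcom": 0.4}}
catalog = [
    {"name": "Long Thriller", "genre": "thriller", "runtime_minutes": 110,
     "popularity_among_similar_users": 0.6},
    {"name": "Short Sitcom", "genre": "sitcom", "runtime_minutes": 25,
     "popularity_among_similar_users": 0.8},
]
ranked = sorted(catalog, key=lambda t: relevance(t, user, {"device": "phone"}),
                reverse=True)
print([t["name"] for t in ranked])  # ['Short Sitcom', 'Long Thriller']
```

On a phone, the short sitcom narrowly outranks the better-matched thriller; on a TV the order flips. Context reshapes the recommendation even when taste stays fixed.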
  • Page 67 The Netflix algorithm slots users into particular "taste communities," of which there are more than two thousand. And there are more than seventy-seven thousand "altgenres" or niche categories,
  • Page 69 Netflix recommendations are less about finding the content that suits a user's preferences and more about presenting what's already popular or accessible, an illusion of taste.
  • Page 71 "Over time, if people are offered things that are not aligned with their interests often enough, they can be taught what to want….
  • Page 73 But the more automated an algorithmic feed is, the more passive it makes us as consumers, and the less need we feel to build a collection, to preserve what matters to us. We give up the responsibility of collecting.
  • Page 75 our cultural collections are not wholly our own anymore.
  • Page 76 The disappearance or overhauling of a particular app throws the content gathered there to the wind.
  • Page 76 Building a collection online more closely resembles building a sandcastle on the beach:
  • Page 76 The shifting sands of digital technology have robbed our collections of their meaning.
  • Page 82 Kabvina built his own narrative arc into his TikTok account, creating a social-media-era hero's journey. He studied the most popular accounts. Influencers like Charli D'Amelio and Emily Mariko became famous in part for getting famous, starting from anonymity. "The biggest trend I'd notice is…[followers] want a protagonist to take them on this journey," Kabvina said. He also carefully optimized his cooking videos according to the data TikTok gave him. Avoiding too much speaking or text made them appealing to a global audience—his food needed no translation. (It was a successful strategy; Mariko also became famous for her speech-less cooking videos.) The TikTok app reveals to creators at which point in a video users tune out and flip to the next video.
  • Page 83 If viewers were skipping at nineteen seconds, Kabvina would go back and examine the underperforming section, and then try to avoid its problems in the next video. Such specific data allowed him to optimize for engagement at every moment.
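That optimization loop reduces to a small computation: find the second at which the audience leaves. The retention numbers below are invented to mirror Kabvina's nineteen-second example.

```python
# Reduce an (invented) per-second retention curve to the moment viewers leave:
# the steepest single-second drop marks the section worth reexamining.
retention = [1.00, 0.97, 0.95, 0.94, 0.93, 0.92, 0.90, 0.89, 0.88, 0.87,
             0.86, 0.85, 0.84, 0.83, 0.82, 0.81, 0.80, 0.79, 0.78, 0.55,
             0.54, 0.53]  # retention[t] = fraction of viewers still watching at second t

drops = [retention[t] - retention[t + 1] for t in range(len(retention) - 1)]
worst_second = max(range(len(drops)), key=lambda t: drops[t]) + 1  # second entered during the drop
print(worst_second)  # 19: viewers bail at nineteen seconds, so that section gets reworked
```

This is the entire feedback loop in miniature: the platform hands the creator a curve, the creator cuts whatever precedes the cliff, and the next video is optimized for engagement at every moment.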
  • Page 84 Culture is continuously refined according to the excesses of data generated by digital platforms, which offer a second-by-second record of what audiences are engaging with and when and how.
  • Page 84 This perception that culture is stuck and plagued by sameness is indeed due to the omnipresence of algorithmic feeds.

Chapter 4: The Influencer Economy
  • Page 134 The tyranny of likes is in part a function of the algorithmic ecosystem we exist in online.
  • Page 134 Over time, a kind of inflation of likes occurred.
  • Page 135 Provocation inspires likes, since the like is a gesture of allegiance and agreement, a symbol of whether the user is on the side of the troll or the trolled. Outrage gets likes because the likes signal sympathetic outrage:
  • Page 138 The likes were not the only reward; they existed in a wider online attention economy that bled into the offline economy at large. Likes lead to attention. Attention leads to new followers; followers who liked and shared my work in turn. More followers led to a veneer of personal authority:
  • Page 138 And that reputation got me commissions from editors, part-time gigs, and full-time jobs, which drove me back to the beginning of that loop. Getting more likes felt like what I was supposed to be doing; it felt like work, and I was getting better at my job.
  • Page 140 commentators on contemporary culture, in 2021. "Algorithmic" has become a byword for anything that feels too slick, too reductive, or too optimized for attracting attention: a combination of high production values with little concern for fundamental content.
  • Page 141 Part of the fear of algorithmically driven art is the obviation of the artist: If viable art can be created or curated by computer, what is the point of the humans producing
  • Page 146 On one hand, this is a kind of democratization: Anyone can publish a book and give it a chance to be sold through the exact same channels, presented in the same way. There is no obstacle of a store's book buyer or the curation of a front table; just the math of the algorithm. The hyper-bestselling author Colleen Hoover provides an example of the opportunities. Hoover began by self-publishing her novels, which often fall into romance, thriller, and young-adult categories, on Amazon.
  • Page 147 On the other hand, the requirement of mass engagement is a departure from the history of literature, in which the opinions of editors and academics have mattered far more than how many copies of a book initially sell.
  • Page 148 It's that algorithms have shaped the overall cultural landscape, conditioning our tastes. Everything exists within the algorithmic context of passive, frictionless consumption.
  • Page 149 Rather than encouraging original artistic achievement, algorithmic feeds create the need for content that exists to generate more content: films that provide ready-made GIFs of climactic scenes to share on Twitter or TikTok and quippy lines that will inspire memes to serve as marketing. The need for engagement can encourage a capitulation to fanservice, or at least an attempt to do so.
  • Page 154 The superficiality of the word itself is indicative: "influence" is never the end point, only a means of communicating a particular message. An influencer is easiest to define by how they make money. Like a media company producing magazines or podcasts, they sell advertising shown to the audiences that they have gathered.
  • Page 154 audiences in the first place is most often the influencer's personal life, their aesthetically appealing surroundings (as well as aesthetically appealing selves) and entertaining activities.
  • Page 154 influencers don't own the infrastructure of their medium.
  • Page 154 Fascination with a person, particularly their appearance or personal life, has smoothed the way to self-promotion since long before the Internet era.
  • Page 155 Consumers have always cared about the lifestyle decisions of celebrities famous for something else:
  • Page 156 The influencer is something of a successor to the blogger, the star of the nascent mainstream Internet in the 2000s.
  • Page 162 While the early promise of social media was to connect users to their actual friends, over time inauthenticity became something to embrace.
  • Page 163 Individual influencers are less remarkable in this decade also because so many users of digital platforms are pressured to act like influencers themselves, constantly creating content, accruing an audience, and figuring out ways to monetize it—either immediately through literal advertising or more gradually through the attention of their peers.
  • Page 164 In Filterworld, culture has become increasingly iterative. It's harder for a creator to go straight to making a movie or publishing a book; she needs to first publish her sample material, describe her vision, and gather an audience online who are engaged fans of her work.
  • Page 164 This need to corral an audience in advance by succeeding on social media can be explained by the useful phrase "content capital." Established by the scholar Kate Eichhorn in her 2022 monograph Content,
  • Page 164 it describes the Internet-era state in which "one's ability to engage in work as an artist or a writer is increasingly contingent on one's content capital;
  • Page 164 That ancillary content might be Instagram selfies, photos of a painting studio, evidence of travel, tossed-off observations on Twitter, or a monologue on TikTok.
  • Page 164 It all builds an audience for the person, who remains a separate entity from the things that they make.
  • Page 164 the author's personal brand is now all that matters;
  • Page 164 it's the work itself that is dead.
  • Page 165 Eichhorn responds to the sociologist Pierre Bourdieu's 1970s concept of "cultural capital":
  • Page 165 Content capital, then, is fluency in digital content: the knowledge of what kinds of content to produce, how the feeds of various platforms work, what they prioritize, and how audiences might react to a given creation. Those who have more content capital gain more followers, and thus more power in the cultural ecosystem of Filterworld.
  • Page 165 more followers and more engagement are always posed as better.
  • Page 165 The primary incentive is to make the numbers go up.
  • Page 165 "One builds up one's content capital simply by hanging out online and, more precisely, by posting content that garners a response and, in turn, leads to more followers and more content,"
  • Page 166 She described that endless race: "Increasingly, what matters is simply that one is producing content and doing so at an increasingly high frequency and volume." Elsewhere in the book, Eichhorn puts it more simply and brutally: "Content begets content."
  • Page 166 exposure is not always personally affirming.
  • Page 167 it can often feel like there is no creativity without attention, and no attention without the accelerant of algorithmic recommendations.
  • Page 167 I decided long ago against fully adapting my voice to the algorithmic feed—
  • Page 167 I found that there were certain ways I could present the things I was doing to maximize my possible content capital. I labored over tweets to share my latest articles, trying to figure out what would get shared the most: a curiosity-gap headline that left a question open, perhaps, or highlighting the most dramatic quote in a story.
  • Page 168 Cultural flattening is one consequence. But the same mechanism is also what makes our public political discourse more and more extreme, because conflict and controversy light up the feed and attract likes in a way that subtlety and ambiguity never will.
  • Page 168 Over the past decade, a generation of "Insta-poets" have emerged on Instagram and sold millions of books to their followers by shaping their work to the structure and demands of the platform.
  • Page 172 There is an element of elitism at play in any evaluation that casts social media as the opposite of art.
  • Page 173 Blatant clarity and simple, literal takeaways versus linguistic difficulty and the need to accept irresolution: one aesthetic approach is not better or worse than the other; they are simply different sets of choices. Yet in Filterworld, we face a cultural environment that inevitably prioritizes the former over the latter because it travels more effectively through algorithmic feeds, and there are fewer and fewer outlets outside of those feeds available for creators to access the audiences they need to survive in such a capitalistic environment.
  • Page 173 Ultimately, the algorithmic feed may not be the death of art, but it often presents an impediment to it.
  • Page 176 There's a homogeneity to the kind of literature that influencers promote, too, narrowing down to the kinds of books that can accelerate through feeds.
  • Page 176 "The problems of homogeneity are not just that it is boring; the most or least offensive stuff rises to the top, because that gets clicks," Depp said. "This is the issue about whoever is succeeding on TikTok this week: People who have never read the book are going to make a video about it, because that's the trending topic. Things start out with genuine interest, but by the thousandth video about it, it has nothing to do with the thing itself."
  • Page 179 Hallie also realized that the Instagram feed rewarded specific qualities. She had always combined visual art and writing, but posts with clear written messages got the most engagement. "If I posted something pretty to look at, it didn't get as much of a response," she said. This effect isn't solely a consequence of the algorithmic feed; consumers have tastes that don't always mesh with an artist's own vision. But the acceleration of the feed and the instantaneousness of the feedback begets an intensified self-consciousness on the part of the artist.
  • Page 180 The pressure that Hallie felt to make the rest of her artwork similarly bright, clear, and simple is much like the pressure that a musician feels to frontload the hook of a song so it succeeds on TikTok or a writer feels to have a take so hot it lights up the Twitter feed.
  • Page 181 That kind of internal creative process, or even the process of thinking on one's own, is something that feels lacking in the Filterworld era, when any idea or thought can be made instantly public and tested for engagement.
  • Page 181 The artist-as-influencer isn't introspective; she exists on the ephemeral surface of things, iterating and adapting according to reactions.
  • Page 181 Hallie's comments made me feel a kind of personal grief: Have I been left incapable of truly thinking for myself, or unwilling to do that creative work without the motivation of an invisible audience?
  • Page 182 "If I adapt to every trend, if I hop on every new platform and try to build a following there, I'm going to be building sandcastle after sandcastle. If the algorithm is failing us now, that means it was never stable. It was like a fair-weather friend."
  • Page 182 The recent history of the 2010s, with the rise and then growing irrelevance of Facebook, has shown that no social network is too big to fail or get supplanted by a competitor that chooses to play by a new set of rules, social or technological.

Chapter 5: Regulating Filterworld
  • Page 183 We cannot wholly opt out while still using the digital platforms that have become necessary parts of modern adult life. Like the post office, the sewer system, or power lines, they are essential, and yet, unlike such public infrastructure, they aren't subject to government oversight or regulation, or the decisions of voters. Recommender systems run rampant.
  • Page 183 In November 2017, a fourteen- year- old student from northwest London named Molly Russell died by suicide. Russell wasn't wholly responsible for her actions,
  • Page 184 Russell's death was part of the human toll of algorithmic overreach, when content moves too quickly at too vast a scale to be moderated by hand. No magazine's editor would have published a flood of such depression content, nor would a television channel broadcast it. But the algorithmic feed could assemble an instant, on-demand collection, delivering what Russell may have found most engaging even though it was harmful for her.
  • Page 185 Though so much of the content we see online is "user-generated"—uploaded freely, without either gatekeeping or support—it still has to fit into preestablished molds determined by corporations.
  • Page 187 as I could, I ventured out of that corporatized space and found a much wider Internet that was more decentralized again. People built their own HTML websites without oversight and often without much professionalism. The web was an amateur zone made up of handmade pages espousing some particular fandom (say, the TV show Gilmore Girls) or niche hobby (building canoes) that were easy to stumble upon with early Google searches. You could use a service like Geocities, which launched in 1994, to build and host a website using basic tools, but no two Geocities pages looked the same. They were quirky collisions of animated GIFs in messy frame layouts, as though a child had made them.
  • Page 191 Already, when Facebook bought Instagram, it felt as though the walls of the Internet were closing in a little tighter around us users. The broad expanse of possibility, of messiness, on a network like Geocities or the personal expression of Tumblr was shut down. Digital life became increasingly templated, a set of boxes to fill in rather than a canvas to cover in your own image.
  • Page 192 Google similarly acquired YouTube in 2006 and turned the video-uploading site into a media-consumption juggernaut, a replacement for cable television.
  • Page 193 There's a certain amount of whiplash that comes with experiencing these cycles of the Internet. We users think we're supposed to behave one way, and then the opposite becomes true, like the movement from pseudonyms to real names. We're asked to use tools to build our own spaces, to freely express ourselves, and then commanded to fit within a preset palette determined by a social network. Yet as soon as one standard becomes dominant, it seems to lose its grip.
  • Page 193 Any joy in the new forms of expression is ruthlessly exploited, most often in the form of increased advertising.
  • Page 194 Still, the Internet in its current era has never looked more monolithic. Individual websites have been subsumed into ever-flowing feeds.
  • Page 194 Decentralization tends to give users the most agency, though it also places a higher burden of labor and responsibility on the individual.
  • Page 195 The quickest way to change how digital platforms work may be to mandate transparency: forcing the companies to explain how and when their algorithmic recommendations are working.
  • Page 195 And if we know how algorithms work, perhaps we'll be better able to resist their influence and make our own decisions.
  • Page 195 Eli Pariser's filter bubbles,
  • Page 195 traditional media can be biased into homogeneity just as well,
  • Page 196 filter bubbles earlier in this book; the phenomenon may have done more to cause the surprise of Trump's win than the fact that it happened.
  • Page 196 Trump did take advantage of algorithmic technology. His campaign used Facebook's targeted advertising program to great effect, pushing messages to voters whose online actions showed that they might be convinced by his politics.
  • Page 196 Facebook ads are often bought based on outcomes rather than how many times they are displayed; the client pays for click-throughs and conversions to actions like political donations. The Trump campaign was all but guaranteed that the algorithmic feed would work in their favor.
  • Page 196 "With recommendation algorithms, you get the same kind of things over time. How do we break those patterns?"
  • Page 197 Mike Ananny and Kate Crawford wrote in a 2016 paper in the journal New Media & Society. Knowing how and why something has been recommended might help to dispel the air of algorithmic anxiety that surrounds our online experiences, since we could identify which of our actions the recommendations are considering.
  • Page 200 Just as digital platforms aren't responsible for explaining their algorithmic feeds, they also don't take responsibility for what the feeds promote—they separate themselves from the outcomes of their recommender systems.
  • Page 200 But in the social media era, it has also allowed the tech companies that have supplanted traditional media businesses to operate without the safeguards of traditional media.
  • Page 202 Social networks displaced traditional publishers by absorbing advertising revenue;
  • Page 202 Even with their restricted circumstances, traditional media companies continue to hold responsibility for every piece of content they publish. Meanwhile, digital platforms could claim they were not media companies at all with the excuse of Section 230.
  • Page 204 If algorithmic feeds mistreat us or contribute to an abusive or exploitative online environment, we, as users and citizens, have little recourse.
  • Page 204 The justices probed the uses and capabilities of algorithmic recommendations and debated whether algorithms can be considered "neutral" (I would argue they cannot),
  • Page 205 In May 2023, the Supreme Court ruled that the tech companies were not liable, and upheld the strongest interpretation of Section 230 once again.
  • Page 205 mainstream Internet, and the 2010s saw the rise and domination of massive digital platforms, then the next decade seems likely to embrace decentralization once more. Agency might be the watchword: the ability of an individual user to dictate how they publish and view content. I have hope for an Internet that's more like Geocities, with expressions of individuality and customization everywhere, but with the multimedia innovations that have made the 2020s' Internet so compelling.
  • Page 206 As Molly Russell, the British teenager who died by suicide, experienced with the avalanche of depression content, recommendations accelerate negative material as much as positive material.
  • Page 206 Facebook outsources much of its human moderation to a company called Accenture, which employs thousands of moderators around the world, including in countries like Portugal and Malaysia.
  • Page 207 The toxic material doesn't just magically vanish because of the mediation of the algorithm. Once again, the human labor is obscured.
  • Page 208 Reaching wide audiences of strangers isn't a right; it's a privilege that doesn't need to be possible for every individual user or post.
  • Page 208 The word amplification describes algorithmic recommendations' role in spreading content more widely than it would otherwise travel:
  • Page 208 Amplification is at the core of Filterworld's problems;
  • Page 210 The rise of social media has created a new set of dynamics for culture and entertainment. Users have much more choice of what to consume at a given moment, and creators have a much easier time reaching audiences by simply uploading their content to the Internet. We don't have to just watch what a producer elects to put on cable television. We have come to expect individualization, whether driven by our own actions or an algorithm. But that seemingly more democratic and low-hierarchy dynamic has also given us a sense that the old laws and regulations don't apply, precisely because we can decide when to watch or listen to something and when to choose another source. We might have more independence, but we ultimately have less protection as consumers.
  • Page 211 If recommendations are to be regulated, certain decisions have to be made based on content.
  • Page 217 As they go into effect, these laws are likely to overhaul our algorithmic landscape, giving users much more agency when it comes to recommendations and the configuration of a content feed. The passive relationship would become a more active one as we begin to figure out our own preferences and shape our digital lives to follow our own tastes. Algorithmic feeds will appear less monolithic and impenetrable, as they do now, and more like the functional tools they are. There's no reason your feed needs to work exactly like my feed. The resulting profusion could lead to more diversity of culture online, as well.
  • Page 220 regulation cannot be the only answer when it comes to culture.
  • Page 220 we also must change our own habits, becoming more aware of how we consume culture and how we can resist the passive pathways of algorithmic feeds.
  • Page 221 The most powerful choice might be the simplest one: Stop lending your attention to platforms that exploit it.
  • Page 221 the more dramatic option is to log out entirely and figure out how to sustain culture offline once more.

Chapter 6: In Search of Human Curation
  • Page 224 If one form of algorithmic anxiety is about feeling misunderstood by algorithmic recommendations, another is feeling hijacked by them, feeling like you couldn't escape them if you tried.
  • Page 228 During my cleanse, I also discovered that recommender systems pop up in unexpected places. I eventually turned to the New York Times app as my primary way of checking in on news, but that app features a "For You" tab, much like TikTok's,
  • Page 229 Escaping algorithms entirely is nearly impossible.
  • Page 229 As I had hoped, I began reading long articles in a single sitting more often and left fewer tabs open in my browser, since I wasn't faced with a cascade of alternative options.
  • Page 230 By the second month of my experiment, when I had adjusted my habits, I began to feel a sense of nostalgia. It reminded me of how I interacted with the Internet as a teenager, back before mainstream social media existed.
  • Page 230 Despite my hesitancy around algorithmic feeds, I could never give up the Internet entirely, because it has brought me too much over my lifetime.
  • Page 234 On TikTok, it's harder to become a connoisseur because you have little chance to develop expertise or assemble the context of what you're looking at. You must work at it, get off the slick routes of the feed, and gradually refine the thing that you seek. The benefit of the slower, self-managed approach to culture is that it might lead to a greater appreciation of the content at hand,
  • Page 240 Curation begins with responsibility. The etymological ancestor of the word curatore was a term for ancient Roman "public officers," according to an 1875 dictionary, positions that predated the emperor Augustus, whose reign began in 27 BCE. They managed various aspects of the city's upkeep:
  • Page 240 In Latin, curare meant to take care of, and curatio indicated attention and management.
  • Page 240 The word's etymology hints at the importance of curating, not just as an act of consumption, taste displaying, or even self-definition, but as the caretaking of culture, a rigorous and ongoing process.
  • Page 241 Those decades saw "the rise of the curator as creator," as the museum-studies scholar Bruce Altshuler put it in his 1994 book Avant-Garde in Exhibition.
  • Page 241 In a sense, the individual star curators are the opposite of recommendation algorithms: they utilize all of their knowledge, expertise, and experience in order to determine what to show us and how to do it, with utmost sensitivity and humanity.
  • Page 242 Yet algorithmic recommendations are also often described as "curating" a feed, even though there is no consciousness behind them.
  • Page 242 What is lost in the overuse of the word is the figure of the curator herself, a person whose responsibility it is to make informed choices, to care for the material under her purview.
  • Page 243 The Internet might have an overflow of curation, but it also doesn't have enough of it, in the sense of long-term stewardship, organization, and contextualization of content—
  • Page 247 The slow process of curation works against the contextlessness, speed, and ephemerality that characterizes the Internet.
  • Page 250 As I've written this book, independent radio DJs have stuck out in my mind as an ideal form of non-algorithmic cultural distribution.
  • Page 258 Curation is an analog process that can't be fully automated or scaled up the way that social network feeds have been. It ultimately comes down to humans approving, selecting, and arranging things.
  • Page 261 Another step toward a more curated Internet is to think more carefully about the business models that drive the platforms we use.
  • Page 265 In my conversations with curators, I found a tone of caring and caretaking that is missing entirely from massive digital platforms, which treat all culture like content to be funneled indiscriminately at high volume and which encourage consumers to stay constantly on the surface.
  • Page 266 Like tobacco companies manufacturing low-tar cigarettes, the algorithmic feeds create the problems they are marketed as solving.

Conclusion
  • Page 275 Walter Benjamin completed a revised version of his essay "The Work of Art in the Age of Mechanical Reproduction."
  • Page 277 Even in the short time of their rise, algorithmic recommendations have warped everything from visual art to product design, songwriting, choreography, urbanism, food, and fashion.
  • Page 277 In terms of how culture reaches us, algorithmic recommendations have supplanted the human news editor, the retail boutique buyer, the gallery curator, the radio DJ—people whose individual taste we relied on to highlight the unusual and the innovative. Instead, we have tech companies dictating the priorities of the recommendations, which are subordinated to generating profit through advertising.
  • Page 279 Resistance to algorithmic frictionlessness requires an act of willpower, a choice to move through the world in a different way. It doesn't have to be a dramatic one.