How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion

  • Page xiv Why do we argue? What purpose does it serve? Is all this bickering online helping or hurting us? I invited the famed cognitive scientist Hugo Mercier, an expert on human reasoning and argumentation, to be a guest on my show. He explained that we evolved to reach consensus--sometimes on the facts, sometimes on right and wrong, sometimes on what to eat for dinner--by banging our heads together. Groups that did a better job of reaching consensus, by both producing and evaluating arguments, were better at reaching communal goals and out-survived those that didn't.
  • Page xiv innate psychology that compels us to persuade others to see things our way when we believe our groups are misguided.
  • Page xv The fact that we so often disagree isn't a bug in human reasoning; it's a feature.
  • Page xv When the tide of public opinion turned on these issues, it shifted so quickly that if people could step into a time machine and go back just a few years, many would likely argue with themselves with the same fervor they argue about wedge issues today.
  • Page xvi the surprising psychology behind how people modify and update their beliefs, attitudes, and values; and how to apply that knowledge to whatever you believe needs changing, whether it's within one mind or a million.
  • Page xvii We will see that the speed of change is inversely proportional to the strength of our certainty, and certainty is a feeling: somewhere between an emotion and a mood, more akin to hunger than to logic. Persuasion, no matter the source, is a force that affects that feeling.
  • Page xvii As Daniel O'Keefe, a professor of communication, defines it, persuasion is "a successful intentional effort at influencing another's mental state through communication in a circumstance in which the persuadee has some measure of freedom."
  • Page xvii Persuasion is not coercion, and it is also not an attempt to defeat your intellectual opponent with facts or moral superiority, nor is it a debate with a winner or a loser. Persuasion is leading a person along in stages, helping them to better understand their own thinking and how it could align with the message at hand. You can't persuade another person to change their mind if that person doesn't want to do so, and as you will see, the techniques that work best focus on a person's motivations more than their conclusions.
  • Page xviii All persuasion is self-persuasion.
  • Page xix Why do I want to change their mind?--in your mental backpack as you travel with me chapter by chapter. And I hope that question will blossom, as it did for me, into a series of questions.
  • Page xix But also, what does the phrase "change your mind" even mean?
  • Page xx we must avoid debate and start having conversations. Debates have winners and losers, and no one wants to be a loser. But if both sides feel safe to explore their reasoning, to think about their own thinking, to explore their motivations, we can each avoid the dead-end goal of winning an argument. Instead, we can pursue the shared goal of learning the truth.
  • Page 9 In April 2015, Charlie landed his current job, which I won't describe in much detail for the sake of his anonymity, but it involves selling properties around the world. "I'm very good at it. I can earn good money," he told me, proud that he had finally eluded his haters. "It took a while, but ultimately my six years of YouTube, or having to just rant and speak eloquently about abstract concepts, it was almost like I did six years of training. And I've developed a very thick skin. I think I am a very good salesman."
  • Page 11 Brian Greene, a physicist who studies string theory, to tell Wired, "We've come to a very strange place in American democracy where there's an assault on some of the features of reality that one would have thought, just a couple years ago, were beyond debate, discussion, or argument."
  • Page 12 A new cold war began, one based on targeted misinformation, and within months Facebook CEO Mark Zuckerberg was sitting before Congress explaining how Russian trolls were seeding news feeds with weaponized clickbait, not so much to misinform but to encourage the sort of dead-end arguing that makes democratic collaboration difficult.
  • Page 12 epistemic chaos: "Is Truth Dead?"
  • Page 13 Inside this new information ecosystem where everyone had access to facts that seemed to confirm their views, we began to believe we were living in separate realities.
  • Page 15 the Leadership LAB. On most Saturdays, the LAB heads out with a rotating but loyal group of volunteers to talk with people at their front doors. After doing this for more than a decade and having more than fifteen thousand conversations, most recorded so they could pore over each exchange to improve their rhetoric, the LAB had slowly honed a method so fast and reliable, so new, that social scientists began buying plane tickets to study it in person.
  • Page 15 They call it deep canvassing.
  • Page 16 Their mission for years, they told me, was the "long game": to change minds about LGBTQ issues by developing best practices for shifting public opinion, and then sharing what they learned about how to do that so they could help win elections and ballot measures around the world. The goal, they explained, was to alter policies and change laws in places where prejudice and opposition to LGBTQ issues still flourished.
  • Page 21 In the parts of Los Angeles County where they had been crushed by two to one or more, Fleischer's team spoke with every voter who answered and found that not only were people willing, they were eager to discuss the recent vote and LGBTQ issues in general. They wanted to be heard and, in some cases, forgiven. So they offered justifications for their behavior.
  • Page 22 People's explanations for voting against same-sex marriage clustered around three values: tradition, religion, and the protection of their children.
  • Page 22 As time passed, justifications mentioning children faded away, leaving behind only tradition and religion.
  • Page 23 Their values were in conflict between protecting their children and protecting the rights of others.
  • Page 23 They held both positive and negative attitudes about same-sex marriage, and if they were ambivalent, that meant they might be open to reconsidering their vote.
  • Page 25 they emphasized something they called "radical hospitality," a form of selfless concern and energetic friendliness akin to what you might experience at a family reunion. From the moment volunteers arrived at a training until they hugged and waved goodbye, the team and the veteran volunteers treated each person as if the day just got better because he or she or they showed up. Radical hospitality is so important to the process that Laura often tells veterans and staff to take breaks if they feel like they can't maintain a joyous enthusiasm.
  • Page 26 Once engaged, people tended to emerge, adamant and confident, ready to defend themselves.
  • Page 26 Canvassers asked where people first heard about the issue at hand. Most quickly realized it was received wisdom--
  • Page 26 Then the canvasser asked if they knew anyone affected by the issue.
  • Page 26 By the end, their own opinions seemed alien.
  • Page 26 the canvasser asked questions and listened, paraphrasing and reflecting back her words.
  • Page 28 Often, it seemed as if the people who changed their minds during these conversations didn't even realize it. They talked themselves into a new position so smoothly that they were unable to see that their opinions had flipped. At the end of the conversation, when the canvassers asked how they now felt, they expressed frustration, as if the canvasser hadn't been paying close enough attention to what they'd been saying all along.
  • Page 29 "There is no superior argument, no piece of information that we can offer, that is going to change their mind," he said, taking a long pause before continuing. "The only way they are going to change their mind is by changing their own mind--by talking themselves through their own thinking, by processing things they've never thought about before, things from their own life that are going to help them see things differently."
  • Page 30 He said to keep that image in mind while standing in front of someone, to remember to spend as little time as possible talking about yourself, just enough to show that you are friendly, that you aren't selling anything. Show you are genuinely interested in what they have to say.
  • Page 30 said, keeps them from assuming a defensive position.
  • Page 30 it's their story that should take up most of the conversation. You want them to think about their own thinking.
  • Page 30 Once that real, lived memory was out in the open, you could (if done correctly) steer the conversation away from the world of conclusions with their facts googled for support, away from ideological abstractions and into the world of concrete details from that individual's personal experiences. It was there, and only there, he said, that a single conversation could change someone's mind.
  • Page 31 Steve explained that after thousands of recorded conversations they had found that battling over differing interpretations of the evidence kept the people they met from exploring why they felt so strongly one way or the other.
  • Page 32 "What I envision when I'm standing in front of a voter is that people have this intellectual, logical reasoning process. That's one part of how they process the world and make decisions. But they have this almost entirely separate emotional reasoning process which is based on feelings and things they've experienced."
  • Page 33 Deep canvassing is about gaining access to that emotional space, Steve explained, to "help them unload some baggage," because that's where mind change happens.
  • Page 35 Steve asked, on abortion rights, where she saw herself on a scale of zero to ten, zero being a belief that there should be no legal access to abortion in any way, and ten being support for complete, full, easy access.
  • Page 35 Then he asked Martha why that number felt right to her.
  • Page 35 allow a person's justifications to remain unchallenged.
  • Page 35 nod and listen.
  • Page 35 The idea is to move forward, make the person feel heard and respected, avoid arguing over a person's conclusions, and instead work to discover the motivations behind them. To that end, the next step is to evoke a person's emotional response to the issue.
  • Page 35 After evoking negative emotions like this, canvassers ask people if their opinion has changed, and they re-ask them where they are on the scale of zero to ten. Sampling their newly salient feelings, people often move a few numbers.
  • Page 36 If she had moved, he would have asked her why. But since she didn't, he asked her what the video made her think.
  • Page 36 Instead of arguing, the canvasser listens, helping the voter untangle their thoughts by asking questions and reflecting back their answers to make certain they are hearing them correctly. If people feel heard, they further articulate their opinions and often begin to question them.
  • Page 36 As people explain themselves, they begin to produce fresh insights into why they feel one way or another.
  • Page 36 Instead of defending, they begin contemplating, and once a person is contemplating, they often produce their own counterarguments, and a newfound ambivalence washes over them. If enough counterarguments stack up, the balance may tip in favor of change.
  • Page 36 if he could evoke a memory from her own life that contradicted the reasoning she had shared, she might notice the conflict without him having to point it out.
  • Page 36 She'd be challenging herself.
  • Page 37 what Steve had been looking for--a real, lived experience, one that was especially laden with emotion.
  • Page 38 In the training they called this "modeling vulnerability," and the idea was that if you open up, so will they.
  • Page 39 she had discovered she was conflicted. She would notice things she didn't before. She had moved from neutral to somewhat supportive, and that counted as change.
  • Page 44 one in ten people opposed to transgender rights changed their views, and on average, they changed that view by 10 points on a 101-point "feelings thermometer,"
  • Page 44 If one in ten doesn't sound like much, you're neither a politician nor a political scientist. It is huge.
  • Page 46 Altogether, Broockman and Kalla found that deep canvassing was 102 times more effective than traditional canvassing, television, radio, direct mail, and phone banking combined.
  • Page 47 consistency bias: our tendency, when uncertain, to assume our present self has always held the opinions it holds today.
  • Page 49 the brain often gets things wrong because it prefers to sacrifice accuracy for speed.
  • Page 49 Without a chance to introspect, we remain overconfident in our understanding of the issues about which we are most passionate. That overconfidence translates to certainty, and we use that certainty to support extreme views.
  • Page 50 When asked to provide opinions on health-care reform, a flat tax, carbon emissions, and so on, many subjects held extreme views. When experimenters asked people to provide reasons for their opinions, they did so with ease. But if asked to explain those issues in mechanistic detail, they became flustered and realized they knew far less about the policies than they thought they did. As a result, their opinions became less extreme.
  • Page 51 people rarely considered the other side's perspective until asked to do so.
  • Page 51 By empathizing, even hypothetically, people softened their positions--something subjects could have done at any time but, until prompted, never considered.
  • Page 52 "Well, it's funny, in a way it's not new at all. We did not invent the concept that one human being can talk with another human being," he told me, laughing. "So in a way, there's nothing original here at all, and yet it is very original, because it is so much against the grain of the dominant political culture."

3. Socks and Crocs
  • Page 56 How is that thing, whatever it is that we call a mind, made in the first place?
  • Page 56 Asking how we make up our minds, and then do or do not change them, is not that distant from asking, What is the very nature of consciousness itself?--a question that may not even have an answer, at least not yet, not in the confines of our current scientific understanding, nor the language we use to communicate it.
  • Page 57 drama that divided the planet." The Dress was a meme, a viral photo that appeared all across social media for a few months. For some, when they looked at this photo, they saw a dress that appeared black and blue. For others, the dress appeared white and gold. Whatever people saw, it was impossible to see it differently. If not for the social aspect of social media, you might have never known that some people did see it differently. But since social media is social, learning the fact that millions saw a different dress than you did created a widespread, visceral response. The people who saw a different dress seemed clearly, obviously mistaken and quite possibly deranged.
  • Page 59 For many, it was an introduction to something neuroscience has understood for a long while, which is also the main subject of this chapter: the fact that reality itself, as we experience it, isn't a perfect one-to-one account of the world around us.
  • Page 59 The world, as you experience it, is a simulation running inside your skull, a waking dream. We each live in a virtual landscape of perpetual imagination and self-generated illusion, a hallucination informed over our lifetimes by our senses and thoughts about them, updated continuously as we bring in new experiences via those senses and think new thoughts about what we have sensed. For those who didn't know this, The Dress demanded they either take to their keyboards to shout into the abyss or take a seat and ponder their place in the grand scheme of things.
  • Page 60 Because no organism can perceive the totality of objective reality, each animal likely assumes that what it can perceive is all that can be perceived. Objective reality, whatever it is, can never be fully experienced by any one creature.
  • Page 60 The extension of this idea is that if different animals live in different realities, then maybe different people live in different realities, too.
  • Page 61 So this idea that subjective reality and objective reality are not the same, that what we experience inside our minds is a representation of the outside world, a model and not a replica, has been brewing among people who think about thinking for a very long time, but Uexküll brought it into a new academic silo--biology.
  • Page 64 For brains, everything is noise at first. Then brains notice the patterns in the static, and they move up a level, noticing patterns in how those patterns interact. Then they move up another level, noticing patterns in how sets of interacting patterns interact with other sets, and on and on it goes. Layers of pattern recognition built on top of simpler layers become a rough understanding of what to expect from the world around us, and their interactions become our sense of cause and effect.
  • Page 64 We start our lives awash in unpredictable chaos, but the regularity of our perceptions becomes the expectations we use to turn that chaos into predictable order.
  • Page 65 What research like this demonstrates is that each and every brain enters the world trapped in a dark vault of a skull, unable to witness firsthand what is happening outside. Thanks to brain plasticity, through repeated experience, when inputs are regular and repeating, neurons quickly get burned into the reciprocal patterns of activation. It creates a unique predictive model in each individual nervous system, a sort of bespoke resting potential for those same networks to light up in the same way in similar circumstances.
  • Page 65 As Bertrand Russell put it, "The observer, when he seems to himself to be observing a stone, is really, if physics is to be believed, observing the effects of the stone upon himself."
  • Page 66 The spectrum of light we can see--the primary colors we call red, green, and blue--are specific wavelengths of electromagnetic energy. These wavelengths of energy emanate from some source, like the sun, a lamp, a candle. When that light collides with, say, a lemon, the lemon absorbs some of those wavelengths and the rest bounce off. Whatever is left behind goes through a hole in our heads called the pupil and strikes the retinas at the back of the eyes where it all gets translated into the electrochemical buzz of neurons that the brain then uses to construct the subjective experience of seeing colors. Because most natural light is red, green, and blue combined and a lemon absorbs the blue wavelengths, it leaves behind the red and green to hit our retinas, which the brain then combines into the subjective experience of seeing a yellow lemon. The color, though, exists only in the mind. In consciousness, yellow is a figment of the imagination. The reason we tend to agree that lemons are yellow is because all our brains pretty much create the same figment of the imagination when light hits lemons and then bounces into our heads.
  • Page 70 in situations of what Pascal and Karlovich call "substantial uncertainty," the brain will use its experience to create illusions of what ought to be there but isn't. In other words, in novel situations the brain usually sees what it expects to see.
  • Page 72 Pascal's lab came up with a term for this. They call it SURFPAD. When you combine Substantial Uncertainty with Ramified (which means branching) or Forked Priors or Assumptions, you will get Disagreement.
  • Page 72 when the truth is uncertain, our brains resolve that uncertainty without our knowledge by creating the most likely reality they can imagine based on our prior experiences.
  • Page 73 When we encounter novel information that seems ambiguous, we unknowingly disambiguate it based on what we've experienced in the past. But starting at the level of perception, different life experiences can lead to very different disambiguations, and thus very different subjective realities. When that happens in the presence of substantial uncertainty, we may vehemently disagree over reality itself--but since no one on either side is aware of the brain processes leading up to that disagreement, it makes the people who see things differently seem, in a word, wrong.
  • Page 77 If you illuminate pink Crocs in only green light, they will appear gray.
  • Page 79 Pascal was feverish about the implications. Neither side was right nor wrong, so arguing for only one side or the other wouldn't arrive at a deeper understanding: that objective reality and subjective realities can differ. Only the two truths combined, the combination of shared perspectives, would alert people there was a deeper truth, and only through conversation would they have any hope of solving the mystery.
  • Page 82 When faced with uncertainty, we often don't notice we are uncertain, and when we attempt to resolve that uncertainty, we don't just fall back on our different perceptual priors; we reach for them, motivated by identity and belonging needs, social costs, issues of trust and reputation, and so on.
  • Page 83 Disagreements like these often turn into disagreements between groups because people with broadly similar experiences and motivations tend to disambiguate in broadly similar ways, and whether they find one another online or in person, the fact that trusted peers see things their way can feel like all the proof they need: they are right and the other side is wrong factually, morally, or otherwise.
  • Page 83 Since subjectivity feels like objectivity, naive realism makes it seem as though the way to change people's minds is to show them the facts that support your view, because anyone else who has read the things you have read or seen the things you have seen will naturally see things your way, given that they've pondered the matter as thoughtfully as you have.
  • Page 83 Therefore, you assume that anyone who disagrees with your conclusions probably just doesn't have all the facts yet. If they did, they'd already be seeing the world like you do. This is why you continue to ineffectually copy and paste links from all our most trusted sources when arguing your points with those who seem misguided, crazy, uninformed, and just plain wrong. The problem is that this is exactly what the other side thinks will work on you.
  • Page 84 Blaise Pascal, Pensées.
  • Page 85 "People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others."
  • Page 86 Pascal and Karlovich's research suggests that simply presenting challenging evidence is not enough. We must meet in ways that allow us to ask and understand how people arrived at their conclusions.
  • Page 87 We must admit if we had experienced what others have, we might even agree with them.
  • Page 87 "cognitive empathy": an understanding that what others experience as the truth arrives in their minds unconsciously, so arguments over conclusions are often a waste of time.
  • Page 87 The better path, they said, would be for both parties to focus on their processing, on how and why they see what they see, not what.
  • Page 87 was planning on spending some time with former members of cults, hate groups, and conspiracy theory communities. Based on what I'd read, people often leave groups like those not because their beliefs are directly challenged, but because something totally outside of the ideology causes them to see it differently.

4. Disequilibrium
  • Page 94 As psychologist Michael Rousell told me, when experiences don't match our expectations, a spike in dopamine lasting about a millisecond motivates us to stop whatever we were doing and pay attention. After the surprise, we become motivated to learn from the new experience so we can be less wrong in the future.
  • Page 95 surprises encourage us to update our behaviors.
  • Page 95 They change our minds without us noticing as the brain quietly updates our predictive schemas, hopefully eliminating the surprise by making it more predictable in the future.
  • Page 95 Our minds are always changing and updating, writing and editing. And thanks to this plasticity, so much of what we consider real and unreal, true and untrue, good and bad, moral and immoral, changes as we learn things we didn't know we didn't know.
  • Page 96 In philosophy, the idea of "knowing" something doesn't mean believing that you know something.
  • Page 96 It means knowing something that also happens to be true.
  • Page 96 belief. To philosophers, beliefs and knowledge are separate, because you can believe things that are false.
  • Page 97 an epistemology is: a framework for sorting out what is true.
  • Page 98 In the end, epistemology is about translating evidence into confidence.
  • Page 98 But some ways of sorting out what the hell is going on are better than others, depending on what it is you want to know.
  • Page 98 Thankfully for us, when it comes to the empirical truth, the epistemology called science seems to have won out, since it is the only one that can build iPhones and vaccines.
  • Page 100 to paraphrase the Pulitzer Prize–winning science writer Kathryn Schulz, until we know we are wrong, being wrong feels exactly like being right.
  • Page 100 Since the brain doesn't know what it doesn't know, when it constructs causal narratives it fills holes in reality with provisional explanations. The problem is that when a group of brains all uses the same placeholder, good-enough-for-now construal to plug such a hole, over time that shared provisional explanation can turn into consensus--a common sense of what is and is not true. This tendency has led to a lot of strange shared beliefs over the centuries, consensus realities that today seem preposterous. For instance, for a very long time most people believed that geese grew on trees.
  • Page 105 When we first suspect we may be wrong, when expectations don't match experience, we feel viscerally uncomfortable and resist accommodation by trying to apply our current models of reality to the situation. It's only when the brain accepts that its existing models will never resolve the incongruences that it updates the model itself by creating a new layer of abstraction to accommodate the novelty. The result is an epiphany, and like all epiphanies it is the conscious realization that our minds have changed that startles us, not the change itself.
  • Page 106 Kuhn wrote that "novelty emerges only with difficulty, manifested by resistance, against a background provided by expectation." In other words, when we don't know what we don't know, at first we see only what we expect to see, even when what we see doesn't match our expectations. When we get that "I might be wrong" feeling, we initially try to explain it away, interpreting novelty as confirmation, looking for evidence that our models are still correct, creating narratives that justify holding on to our preconceived notions. Unless grandly subverted, our models must fail us a few times before we begin to accommodate.
  • Page 107 Kuhn was suggesting that when we update, it isn't the evidence that changes, but our interpretation of it.
  • Page 110 When a person's core expectations are massively subverted in a way that makes steady change impossible, they may experience intense, inescapable psychological trauma that results in the collapse of the entire model of reality they once used to make sense of the world.
  • Page 110 Some go down a maladaptive spiral... However, most people intuitively and immediately go searching among friends, family, and the internet for new information, new perspectives, raw material for rebuilding themselves.
  • Page 110 "posttraumatic growth."
  • Page 111 in "the frightening and confusing aftermath of trauma, where fundamental assumptions are severely challenged," people must update "their understanding of the world and their place in it." If they don't, the brain goes into a panic, unable to make sense of reality. The resolution of that panic necessitates new behavior, new thoughts, new beliefs, and a new self-concept.
  • Page 111 after losing a child, after a crushing divorce, after surviving a car accident or a war or a heart attack, people routinely report that the inescapable negative circumstances they endured left them better people. They shed a slew of outdated assumptions that, until the trauma, they never had any reason to question, and thus never knew were wrong. People report that it feels like unexplored spaces inside their minds have opened up, ready to be filled with new knowledge derived from new experiences.
  • Page 112 Anything reduced to rubble won't be rebuilt in the same, unreliable way again.... The result is a new worldview that is "far more resistant to being shattered." In crisis, we become radically open to changing our minds.
  • Page 113 Posttraumatic growth is the rapid mind change that comes to a person after a sudden, far-reaching challenge to the accuracy of their assumptive world. When our assumptions completely fail us, the brain enters a state of epistemic emergency. To move forward, to regain a sense of control and certainty, you realize some of your knowledge, beliefs, and attitudes must change, but you aren't sure which.
  • Page 116 Unless otherwise motivated, the brain prefers to assimilate, to incorporate new information into its prior understanding of the world. In other words, the solution to "I might be wrong" is often "but I'm probably not."... To orient ourselves properly, we update carefully. So if novel information requires us to update our beliefs, attitudes, or values, we experience cognitive dissonance until we either change our minds or change our interpretations.
  • Page 117 So just how much cognitive dissonance does it take for a person to switch from assimilation to accommodation? Is there a quantifiable point at which the brain realizes its models are incorrect or incomplete and switches from conservation to active learning? Could we put a number on it?
  • Page 119 Assimilation, they discovered, has a natural upper limit.... For most, he said, the tipping point came when 30 percent of the incoming information was incongruent. ... Some people may need a bit more disconfirmation than others. Also, some people may be in a situation where disconfirmation is unlikely, cut off from challenging ideas, curating an information feed that stays below the threshold.
  • Page 120 the important point isn't the specific number found in this one study, just that there is a number, a quantifiable level of doubt when we admit we are likely wrong and become compelled to update our beliefs, attitudes, and values. Before we reach that level, incongruences make us feel more certain, not less.
  • Page 121 they didn't expect to see that kind of card, and thus they also couldn't see them. Once they did see them, they tried to make them fit into their old model, the one where those kinds of cards didn't exist. Only when that model failed to make sense of what they were experiencing did they feel compelled to accommodate, to change their minds.... It's dangerous to be wrong, but it's also dangerous to be ignorant, so if new information suggests our models might be incorrect or incomplete, we first attempt to fit the anomalies into our old understanding. If they do, we continue using those models until they fail us too many times to ignore.
  • Page 132 [Westboro Church defector] He felt overwhelmed by a torrent of information that he once considered noise. All at once, he began to feel an intense uncertainty not only about what was true, but about who he was. ... Zach reiterated that he didn't leave the church because he changed his opinions; he changed his opinions because he left the church. ... And he left the church because it had become intolerable for other reasons.
  • Page 139 about assimilation and accommodation, how we first try to make novel and challenging information fit into our worldviews until those times when we realize we must update our worldviews to make room for them.
  • Page 146 It was all of it together, each an anomaly that alone could have been assimilated, novel information that created a mounting cognitive dissonance that at one point in her life could have been assuaged by interpreting it as confirmation of her worldview in some way, but taken together it felt like overwhelming disconfirmation.
  • Page 151 Contact with the world of the sinners was never forbidden, but the nature of their contact was tightly controlled, and much of the time that contact was hostile and antagonistic.
  • Page 154 it was the loss of a sense of community that prompted them to leave. ... Still, even when they felt their first doubts, it took others, people on the outside who listened and showed them counterarguments wrapped in kindness, to truly pull them away. ... they couldn't leave their worldviews behind until they felt like there was a community on the outside that would welcome them into theirs.
  • Page 155 For him, every pattern held a fascination. At every turn, he seemed on the alert for hidden meanings, for how the mundane fit into a bigger system of ideas and agendas.
  • Page 156 he had never been part of a stable community that took him seriously until the truthers welcomed him.
  • Page 157 He had no tribe. Then in 2006, Charlie watched a video in which Alex Jones explained how 9/11 was an inside job. Intrigued, he began to spend a lot of time online watching videos that made arguments like Jones's. Soon he was part of group discussions. And eventually, part of the groups themselves. ... "You're looking for a scapegoat; your life is meaningless; you're just a little nobody, but then suddenly you feel like you're part of an elite. You know things."
  • Page 159 In 2016, cognitive neuroscientists Sarah Gimbel, Sam Harris, and Jonas Kaplan identified a group of subjects who held strong opinions by asking them to mark on a scale from one to seven how strongly they believed in a variety of statements, some political, some neutral.
  • Page 159 When a person was challenged about political wedge issues like abortion or welfare or gun control, the scanner showed that their brains went into fight-or-flight mode, causing their bodies to pump adrenaline, stiffening the muscles and moving blood out of the nonessential organs. As Gimbel told me, "The response in the brain that we see is very similar to what would happen if, say, you were walking through the forest and came across a bear."
  • Page 160 "Remember that the brain's first and primary job is to protect our selves," Kaplan told me. "That extends beyond our physical self, to our psychological self. Once these things [beliefs, attitudes, and values] become part of our psychological self, they are then afforded all the same protections that the brain gives to the body."
  • Page 161 In the famous Solomon Asch experiment, people denied the truth of their own eyes when surrounded by actors who claimed a short line and a long line printed on a large card were the same length. ... A third of subjects bowed to social pressure and said they agreed, though later they said they internally felt at odds with the group. Asch's work also led to the Stanley Milgram experiments into obedience, in which experimenters successfully goaded two thirds of subjects, who believed they were delivering electric shocks to strangers, into cranking the electricity up to lethal doses.
  • Page 162 He studied prejudice throughout the 1950s, and at the time the assumption across most of psychology was that animosity between groups was based on aggressive personalities rising to power and influencing others.
  • Page 163 What he discovered was that there is no baseline. Any difference, of any kind, would activate our innate us-versus-them psychology.
  • Page 164 once people become an us, we begin to loathe a them, so much so that we are willing to sacrifice the greater good if it means we can shift the balance in our group's favor. ... the research, in both psychology and neuroscience, suggests that because our identities have so much to do with group loyalty, the very word itself, identity, is best thought of as that which identifies us as, well, us--but more importantly not them. ... Humans aren't just social animals; we are ultra-social animals. We are the kind of primate that survives by forming and maintaining groups. Much of our innate psychology is all about grouping up and then nurturing that group--working to curate cohesion. If the group survives, we survive. ... So a lot of our drives, our motivations, like shame, embarrassment, ostracism, and so on, have more to do with keeping the group strong than keeping any one member, including ourselves, healthy. In other words, we are willing to sacrifice ourselves and others for the group, if it comes to that.
  • Page 165 humans value being good members of their groups much more than they value being right, so much so that as long as the group satisfies those needs, we will choose to be wrong if it keeps us in good standing with our peers.
  • Page 166 In times of great conflict, where groups are in close contact with each other, or communicating with each other a lot, individuals will work extra hard to identify themselves to each other as us and not them.
  • Page 166 Any opinion, he said, can become fused with group identity.
  • Page 168 crackpot. His credentials, of course, never changed. The research into tribal psychology is clear. If a scientific, fact-based issue is considered neutral--volcanoes or quasars or fruit bats--people ... tend to trust what an expert has to say. But once tribal loyalties are introduced, the issue becomes debatable. ... The average person will never be in a position where beliefs on gun control or climate change or the death penalty will affect their daily lives. The only useful reason to hold any sort of beliefs on those issues, to argue about them, or share them with others is to "convey group allegiance," ... If your values seem out of alignment with the group, "you could really suffer serious material and emotional harm," explained Kahan.
  • Page 169 it is impossible to know or evaluate everything. The world is too vast, too complex, and ever-changing. So a hefty portion of our beliefs and attitudes are based on received wisdom from trusted peers and authorities. Whether in a video, within a textbook, behind a news desk, or standing at a pulpit, for that which we can't prove ourselves, it is in their expertise we place our faith. ... These reference groups are where we get our knowledge about Saturn's moons and the nutritional value of granola, what happens after we die and how much money Argentina owes China. They also influence our attitudes about everything from jazz trombones to nuclear power and the healing power of aloe vera. We consider what they tell us to be true, the prevailing attitude among them to be reasonable, because we trust they have vetted the information. We trust them because we identify with them. They share our values and our anxieties. They seem like us, or they seem like the people we would like to be. ... Once we consider a reference group trustworthy, questioning any of their accepted beliefs or attitudes questions all of them, and this can be a problem. Humans are primates, and primates are gregarious creatures.
  • Page 170 Scientists, doctors, and academics are not immune. But lucky for them, in their tribes, openness to change and a willingness to question one's beliefs or to pick apart those of others also signals one's loyalty to the group. Their belonging goals are met by pursuing accuracy goals. For groups like truthers, the pursuit of belonging only narrowly overlaps with the pursuit of accuracy, because anything that questions dogma threatens excommunication.
  • Page 171 Conspiratorial thinking becomes most resistant to change once a person becomes bound to a group identity as a conspiracy theorist. After that, a threat to the beliefs becomes a threat to the self, and the psychological mechanisms that bind us together as groups take over; those are what prevent the metacognition necessary to escape.
  • Page 172 when we are fearful, we are constantly attempting to reduce the chaos and complexity of an uncertain world into something manageable and tangible, something we can fight, like the work of a small group of malevolent puppet masters. At our most anxious, we give the side eye to governments and institutions and political parties--to the groups that we feel are not our own--not just a few nearby individuals.
  • Page 172 If conspiracy theorists discover any disconfirmatory evidence, then they may conclude it was planted by the conspirators to throw them off the trail.
  • Page 176 When we feel as though accepting certain facts could damage our reputation, could get us ostracized or excommunicated, we become highly resistant to updating our priors. But the threat to our reputation can be lessened either by affirming a separate group identity or reminding ourselves of our deepest values. ... Subjects who got a chance to affirm they were good people were much more likely to compromise and reach an agreement with their ideological opponent than people who felt their reputations were at stake.
  • Page 177 oppositional identity: he saw himself as a subversive, an underdog who opposed the status quo
  • Page 178 Conspiracy theorists and fringe groups may hold individually coherent theories, but there is no true consensus, just the assumption of consensus. If they hung out together, they might catch on to that, but since they rarely do, they can each keep their individual theories and still assume they have the backing of a tribe. They never get a chance to argue face-to-face, so there is no evolution of ideas, no central theory strengthened by constant challenge and defense.
  • Page 179 If the brain assumes the risks of being wrong outweigh any potential rewards for changing its mind, we favor assimilation over accommodation, and most of the time that serves us well.
  • Page 180 "I've always been looking for my tribe, and something started happening in my brain on that 9/11 trip," Charlie said. "Meeting all these people. I started to see that perhaps the tribe that welcomed me so much were not mentally healthy people."
  • Page 182 psychology, this is called the introspection illusion. Decades of research had shown that though we often feel very confident that we know the antecedents of our own thoughts, feelings, and behaviors, along with the sources of our motivations and goals, we are rarely privy to such information. Instead, we observe our own behavior and contemplate our own thoughts the way an observer would another person, and then we create rationalizations and justifications for what we think, feel, and believe.
  • Page 185 The subjects in the Jane study started with the same evidence, but when differently motivated by different questions, they generated different arguments for different conclusions.
  • Page 185 Basically, when motivated to find supporting evidence, that's all we look for. When we desire to find a reason for A over B, we find it.
  • Page 189 epistemic vigilance.
  • Page 189 In an information exchange, epistemic vigilance helps protect individuals from updating too hastily. Without the order afforded by rough consensus, social situations would become unnavigable, and the behaviors that usually put food in your belly and keep your blood in your body might fail in both regards. By avoiding bad information, even from people you typically trust, brains and groups maintain their vital cohesion.
  • Page 192 Reasoning is often confused with reason, the philosophical concept of human intellect and rationality.
  • Page 192 In short, reasoning is coming up with arguments--plausible justifications for what you think, feel, and believe--and plausible means that which you intuit your trusted peers will accept as reason-able.
  • Page 194 Research shows people are incredibly good at picking apart other people's reasons. We are just terrible at picking apart our own in the same way.
  • Page 198 If we do debate, we tend to fall prey to what legal scholar Cass Sunstein calls the "law of group polarization," which says that groups who form because of shared attitudes tend to become more adamant and polarized over time. This is because when we wish to see ourselves as centrists but learn that others in our group take a much more extreme position, we realize that to take the middle position, we must shift our attitude in the direction of the extreme.
  • Page 198 In response, people who wish to take extreme positions must shift further in that direction to distance themselves from the center. This comparison-to-others feedback loop causes the group as a whole to become more polarized over time, and as consensus builds, individuals become less likely to contradict it.
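The comparison-to-others feedback loop Sunstein describes can be sketched as a toy simulation. The update rule, rates, and clipping here are illustrative assumptions; the only claim taken from the text is that members shift toward, and then slightly past, the direction the group already leans.

```python
# Toy simulation of the "law of group polarization" described above:
# each round, members move toward the group mean, then overshoot a
# little in the direction the group already leans. The update rule
# and rates are illustrative assumptions, not measured values.

def polarize(attitudes, rounds=10, pull=0.3, push=0.1):
    """attitudes: floats in [-1, 1], positive = the group's shared
    direction. Returns the attitudes after the given number of rounds."""
    for _ in range(rounds):
        mean = sum(attitudes) / len(attitudes)
        lean = 1 if mean >= 0 else -1
        attitudes = [
            max(-1.0, min(1.0, a + pull * (mean - a) + push * lean))
            for a in attitudes
        ]
    return attitudes

start = [0.1, 0.3, 0.5, 0.2]  # a mildly like-minded group
end = polarize(start)
print(round(sum(start) / 4, 2), "->", round(sum(end) / 4, 2))
```

Even though every individual update looks like mild conformity, the group mean drifts steadily toward the extreme, which is the feedback loop the notes describe.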
  • Page 200 Sure, the internet makes it easier to form groups around our biased and lazy reasoning; but it also exposes us to the arguments of those outside of our groups. Spend enough time in places like Reddit or Twitter or Facebook, with all the arguing and all the bad ideas fighting one another, and even if you remain silent, someone will voice something that resembles your private opinion, and someone will argue with them. Even as spectators, we can realize when the weaknesses of our justifications have been exposed.
  • Page 200 For Stafford, this means if we can create better online environments, ones designed to increase the odds of productive arguments instead of helping us avoid arguing altogether, we may look back on this period of epistemic chaos as a challenge we overcame with science.
  • Page 203 psychology defines beliefs as propositions we consider to be true. The more confidence you feel, the more you intuit that a piece of information corresponds with the truth. The less confidence, the more you consider a piece of information to be a myth. ... Attitudes, however, are a spectrum of evaluations, feelings going from positive to negative that arise when we think about, well, anything really. We estimate the value or worth of anything we can categorize, and we do so based on the positive or negative emotions that arise when that attitude object is salient. Those emotions then cause us to feel attracted or repulsed by those attitude objects, and thus influence our motivations. Most importantly, attitudes are multivalent. We express them as likes or dislikes, approval or disapproval, or ambivalence when we feel both.
  • Page 204 Taken together, beliefs and attitudes form our values, the hierarchy of ideas, problems, and goals we consider most important. ... The realization that opinions were more influenced by attitudes than beliefs revealed an unexplored territory.
  • Page 205 Much of the research had been based on an idea put forward by a sociologist and political scientist named Harold Lasswell. He said that all communications between humans could be broken down to: "Who says what to whom in which channel and with what effect?" Who referred to the communicator. What referred to the message. To whom referred to the audience. In which channel referred to the medium or the context. With what effect referred to the impact the message had on the audience.
  • Page 206 Petty and Cacioppo realized the reason it didn't make sense was because there were two higher-level variables at play. ... Petty and Cacioppo used the terms "high elaboration" and "low elaboration" to describe these two kinds of thinking.
  • Page 207 the elaboration likelihood model is that persuasion isn't only about learning the information. Elaboration is contextualizing the message after it gets inside your head, something more akin to how people arrive at different interpretations of inkblots in a Rorschach test. ... information alone won't be sufficient to persuade some people. Individuals vary in how they assimilate such concepts into their existing models. ... the same message that persuaded one person would discourage another. ... Motivating factors that increase likelihood include not only relevance but incentives to reach accurate conclusions, a feeling of responsibility to make sense of the message's claims, and a personality trait called "high need for cognition."
  • Page 208 When elaboration likelihood is high, people tend to take what Petty and Cacioppo called the "central route"; but as likelihood drops off, people tend to move onto what they called the "peripheral route."
  • Page 208 On the central route, the merits of the message matter. On the peripheral route, the merits are ignored and people focus on simple, emotional cues.
  • Page 209 Petty and Cacioppo found that the more motivated the students, the more they took the central route. On that route, they paid more attention, and so the stronger arguments were more persuasive.
  • Page 210 Research has found that successful attitude change via the central route may take more effort, but it also creates more enduring attitudes. Messages that persuade via the peripheral route tend to do so quickly and easily, which is great for making a sale or getting people to go vote, but the changes they produce are weak. They fade with time and can be reverted with minimal effort. ... So which route should we encourage people to take? That depends. Vodka, for instance, is colorless, odorless, and mostly tasteless. There's no great distinction between brands (until the next morning). With something like that, we would be correct to encourage people to take the peripheral route. It would be better for a vodka company to focus on interesting packaging, celebrity endorsements, and ad campaigns that play up the luxury, prestige, or playfulness of the brand. To make up for the fact that the peripheral route doesn't lead to long-lasting change, they would need to continually deliver emotional appeals and routinely change out the presentation of the messages. Advertising can accomplish this with a constant stream of rotating celebrities, slogans, logos, and so on.
  • Page 211 However, if we are trying to change attitudes about complex fact-based issues like immigration or health care or nuclear power, we need to know our audience. What motivates them? Are they knowledgeable? Are they distracted in some way? For facts to work, we need to move them onto the central route and keep them there. If we know they are already motivated and knowledgeable about the topic, most of the work is done for us. If not, facts must be delivered by a trusted source in a setting where people are amenable to learning new information.
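The two ELM routes described above can be sketched in miniature. The fields, threshold, and scoring below are illustrative assumptions, not quantities from Petty and Cacioppo's research; the sketch only encodes the routing idea: high elaboration sends a message down the central route, where argument strength matters and change endures, while low elaboration sends it down the peripheral route, where cues dominate and change is fragile.

```python
# Toy sketch of the elaboration likelihood model's two routes.
# All numeric scales and the 0.5 threshold are illustrative
# assumptions made for this example.

from dataclasses import dataclass

@dataclass
class Audience:
    motivation: float  # personal relevance, incentives (0..1)
    ability: float     # knowledge, attention, low distraction (0..1)

@dataclass
class Message:
    argument_strength: float  # merits of the case (0..1)
    peripheral_cues: float    # packaging, celebrity, prestige (0..1)

def persuasion_outcome(aud, msg, threshold=0.5):
    """Route the message; return (route, attitude_change, durable)."""
    elaboration = aud.motivation * aud.ability
    if elaboration >= threshold:
        # Central route: the merits of the message drive durable change.
        return ("central", msg.argument_strength * elaboration, True)
    # Peripheral route: cues drive quick but fragile change.
    return ("peripheral", msg.peripheral_cues * (1 - elaboration), False)

print(persuasion_outcome(Audience(0.9, 0.8), Message(0.7, 0.2)))
print(persuasion_outcome(Audience(0.2, 0.5), Message(0.7, 0.9)))
```

The vodka example from the notes maps onto the second call: a low-elaboration audience, a weak case on the merits, but strong peripheral cues.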
  • Page 212 In the late 1980s, Shelly Chaiken and Alice H. Eagly introduced the heuristic-systematic model (HSM). It posits that when lazily thinking about alternative ways of feeling about the world, we use heuristics or simple rules of thumb that mostly show we are right. When thinking effortfully, we systematically process information considering all the ways we might be wrong. ... Most of the time when there's a handy heuristic available, the HSM says we will fall back on it. Brains are cognitive misers, as psychologists like to say.
  • Page 213 "Certainty gives an air of objectivity to the subjective. When a person says, 'This is the best movie of 2019,' they mean it. That feels like a fact to them." ... "You receive a billion messages a day from advertisements, politics, social media, and so on," said Luttrell. "You can't engage with all of them, but some will affect you, and how they affect you is different when you are invested or can dig through the evidence. Both models have the insight that, at the end of the day, it depends on how deeply the audience is engaging with the message. And that hadn't been considered until these models."
  • Page 213 [It is] important to sort out a person's values and motivations.
  • Page 225 The aim is no longer to attempt to change people's minds. The goal is to help people arrive at a more rigorous way of thinking, a better way of reaching certainty or doubt. What a person believes is no longer the point of the conversation, but why and how they believe those things and not others.
  • Page 225 He'd later clarify, as so often is the case, that the phrase "change your mind" can mean many things. What he meant was that he doesn't set out in a conversation to change people's conclusions about what is and is not true, or moral, or important. Regardless, that's usually what happens if someone takes the time to go through all the steps with him.
  • Page 226 About Street Epistemology... HERE ARE THE STEPS:
    1. Establish rapport. Assure the other person you aren't out to shame them, and then ask for consent to explore their reasoning.
    2. Ask for a claim.
    3. Confirm the claim by repeating it back in your own words. Ask if you've done a good job summarizing. Repeat until they are satisfied.
    4. Clarify their definitions. Use those definitions, not yours.
    5. Ask for a numerical measure of confidence in their claim.
    6. Ask what reasons they have to hold that level of confidence.
    7. Ask what method they've used to judge the quality of their reasons. Focus on that method for the rest of the conversation.
    8. Listen, summarize, repeat.
    9. Wrap up and wish them well.
  • Page 233 Anthony emphasized that street epistemology is about improving people's methods for arriving at confidence, not about persuading someone to believe one thing more than another.
  • Page 234 He reiterated the need to stay honest. Ask outright, "With your consent, I would like to investigate together the reasoning behind your claims, and perhaps challenge it so that it either gets stronger or weaker—because the goal here is for both of us to walk away with better understandings of ourselves," or something like that. And if that isn't your goal, it won't work. You can't fake it.
  • Page 234 You're guiding them through their reasoning so that they can understand it. "That's it. It's surprising how that is really it."
  • Page 234 Watching Anthony work and then listening to him explain the method, I couldn't help but notice that street epistemology and deep canvassing seemed incredibly similar in many ways.
  • Page 236 [The deep canvassing steps:]
    1. Ask a nonthreatening, open-ended question. Something like, "I've been reading a lot about vaccines lately, have you seen any of that?"
    2. Just listen for a while.
    3. Communicate your curiosity and establish rapport by asking a nonjudgmental follow-up question.
    4. Reflect and paraphrase. Summarize what you've heard so far to make the other person feel heard and respected.
    5. Look for common ground in the person's values. You might not agree with their argument, but you can communicate that you too have values like theirs, fears and anxieties, concerns and goals like they do. You just think the best way to deal with those issues is slightly different.
    6. Share a personal narrative about your values to further connect.
    7. Finally, if your views have changed over time, share how.
  • Page 237 Tamerius said she thought it might be, and that it seemed to her that everyone was pulling from the same kinds of lessons that therapists had learned over the last fifty years dealing with people resistant to change.
  • Page 244 We don't decide or choose to be certain or uncertain; we just feel it. ... The vast majority of what the brain does happens "beneath thought, and then it's projected into consciousness."
  • Page 245 Beliefs and doubts are better thought of as processes, not possessions. They aren't like marbles in a jar, books on a shelf, or files in a computer. Belief and doubt are the result of neurons in associative networks delivering an emergent sensation of confidence or the lack thereof.
  • Page 251 "I want to live in a world where people believe true things. But I've realized that ridicule, being angry and telling people that they're mistaken, is not going to help them. We're all sort of in the same boat. We're just grasping for reasons to justify the views that we've already built. Once you know that, you begin to feel empathy, you really do. You begin to have epistemic humility about what you yourself believe."
  • Page 263 The more people grow up within, or eventually obtain, physical and economic security, the more they develop values of individuality, autonomy, and self-expression.
  • Page 271 Psychologist Gordon Allport... outlined its principles in his landmark 1954 book The Nature of Prejudice. ... Allport spent years researching prejudice, and in his book said that before minds can change concerning members of a minority or an out-group, they must make true contact. First, members must meet, especially at work, under conditions of equal status. Second, they must share common goals. Third, they should routinely cooperate to meet those goals. Fourth, they must engage in informal interactions, meeting one another outside of mandated or official contexts, like at one another's homes or at public events. And finally, for prejudice to truly die out, the concerns of the oppressed must be recognized and addressed by an authority, ideally the one that writes laws.
  • Page 273 The creation of new conceptual categories is the greatest sign that accommodation is occurring on a large scale, and thus social change is imminent. ...For instance, the term designated driver was invented by the Harvard Alcohol Project as a public health initiative and then seeded into popular television shows like Cheers and L.A. Law. ... If you accepted the term and used it, then it created dissonance with the urge to drink and drive. ... To resolve the dissonance, the existing model had to be updated—people who drink do not drive.
  • Page 275 One of the curious aspects of moving from one paradigm to another is that the moment a better explanation comes along that can accommodate the anomalies that the previous paradigm couldn't assimilate, the anomalies simply become facts. We rearrange our categories, create new ones, and fill them with refined definitions.
  • Page 276 Researchers say, in short, it was about trust—we don't live in a post-truth world, but a post-trust world. A general distrust of media, science, medicine, and government makes a person very unlikely to get vaccinated no matter how much information you throw at them, especially when the people they do trust share their attitudes.
  • Page 277 So the research suggests that to shift hesitant attitudes about vaccines or anything else, we must identify who is hesitant, what institutions they most trust, and then distribute the vaccine from the manifestations of those institutions that will appeal to the most socially connected groups within that population.

Coda
  • Page 290 Whatever model you subscribe to as a flat-Earther, the binding idea is that there is a mysterious powerful them who at some point learned the Earth was flat—either through seeing it from space or from exploring to the farthest edges of the disc—and now they are covering it up for some reason.
  • Page 291 Beneath that dogma is a value that expresses itself in attitudes. Flat-Earthers don't distrust the scientific method, just the institutions that use it; so they often use the scientific method to test their hunches. When they perform experiments and the results suggest that their hypothesis is incorrect or provides evidence for a competing hypothesis, they dismiss that evidence as anomaly.
  • Page 291 Science is smarter than scientists, and the method is what delivers results over time. But for it to work, you must be willing to say you are wrong. And if your reputation, your livelihood, your place in your community are at stake, well, that can be hard to do.
  • Page 292 when interacting with someone who is vaccine-hesitant, you'll get much further if you frame it as respectful collaboration toward a shared goal, based on mutual fears and anxieties, and demonstrate you are open to their perspective and input on the best course of action.