
Book Summary: How Minds Change – The Surprising Science of Belief, Opinion, and Persuasion

How Minds Change (2022) is a deep dive into why we believe, why we keep believing, and why, sometimes, we stop believing. More than that, it’s a guide to changing minds – not through manipulation or coercion, but through empathy and open-mindedness.

Introduction: Learn why minds stay stuck in their ways – and how you can change them.

What makes a conspiracy theorist stop believing that 9/11 was a hoax? How does a former cult member decide to leave their old beliefs behind? What causes someone to go from strictly opposing same-sex marriage to being strongly in favor?


Often, these kinds of mindset shifts can seem like random, maybe even lucky, confluences of events. After all, how many times have you tried to change someone’s mind about something and gotten absolutely nowhere?

Despite how it sometimes seems, there’s a pattern behind when and how people change their minds. Thanks to neuroscientific and sociological research, that pattern is now much more visible – and we can use it to change even the most stubborn of minds.

In this summary, you’ll learn

  • why people hold so tightly to their beliefs;
  • how a conversation can turn the tide of an election; and
  • how to get someone to really scrutinize their reasoning.

Even the most stubborn of believers can ultimately change their minds.

In June 2011, five British conspiracy theorists boarded a flight in London bound for New York City. They were accompanied by a TV crew responsible for creating the BBC series Conspiracy Road Trip. In each episode of the show, a different part of the conspiracy community travels somewhere in the world. There, they meet experts and eyewitnesses who challenge their beliefs with facts and evidence. The goal is to get them to change their minds by the end of the episode.

This particular episode focused on five “truthers” – people who believe that the official narrative of what happened on 9/11 is a lie. They traveled to New York, Virginia, and Pennsylvania, where they met experts in explosives, demolition, air travel, and construction, in addition to family members of victims, government officials, and architects. They also trained in a commercial airline simulator and took flying lessons that let them soar over New York City.

How many of them do you think ultimately changed their minds? On every other episode of Conspiracy Road Trip, the number was exactly zero. But this time, it was one.

The person in question was Charlie Veitch, a prominent thought leader within the truther community. At the time, Charlie was famous for his YouTube conspiracy videos – some with over a million views – and for his regular practice of hitting the streets with a megaphone, trying to recruit people into the movement. He had befriended and collaborated with well-known conspiracy theorists Alex Jones and David Icke.

However, something changed for Charlie during the filming of the episode. His certainty in his beliefs began to erode as he talked with demolition experts, studied the blueprints of the Twin Towers, and attended the flight school. His final epiphany occurred when he met two people, Alice Hoagland and Tom Heidenberger, who had lost family members during the attacks. As he described it, the change in his opinion was like a sudden “bang!”

When the truthers reconvened later in the episode, none of them were on the same page as Charlie. They argued that Hoagland – one of the family members that Charlie had met – had either been brainwashed by the FBI or was a paid actress. They were firmly caught in the conspiratorial loop – a kind of logical prison in which people claim that any contradictory evidence is purposely designed and planted by the conspirators to mask the truth.

How, then, was Charlie able to break free of the loop? Was it the strength of the evidence alone that convinced him? It couldn’t have been – otherwise the other truthers on the trip would have been equally convinced. In fact, there was something else going on in Charlie’s life that set the stage for him to change his mind. To understand why, we’ll need to talk about why people begin to hold tightly to a set of beliefs in the first place.

People hold on to their beliefs because it feels psychologically safe.

What happens inside the brain when a person’s deeply held beliefs are challenged? In 2016, a group of neuroscientists – Sarah Gimbel, Sam Harris, and Jonas Kaplan – decided to find out.

First, they gathered a group of participants who held strong opinions about a variety of topics. They placed the participants inside MRI machines and then presented counterarguments to their professed beliefs. Some of the beliefs the researchers rebutted were neutral. For example, some subjects strongly believed that Thomas Edison invented the lightbulb. Those subjects read that the lightbulb had in fact been “invented 70 years before Edison.” Other beliefs the researchers rebutted were of a political nature. Say a participant strongly believed in strict gun control. That participant would have been shown a statement like “Ten times more people are murdered with kitchen knives each year than are killed by assault weapons.”

After reading the counterarguments, subjects read their original belief statements again. Then, the researchers asked them to rank their feelings on a scale from one to seven. For the neutral statements, participants’ beliefs softened when exposed to the counterarguments. But when it came to the politically charged topics, their brains responded to counterarguments as if they were literal, physical threats. They signaled their bodies to release adrenaline, which stiffened their muscles and caused blood to rush away from their nonessential organs. In essence, the participants reacted the same way they would if they’d been walking through a forest and came across a bear.

But why? Why does the brain respond like that? Well, because its job is to protect us – in both the physical and psychological sense. Once our beliefs and attitudes are adopted as part of the psychological self, the brain protects them as if they were parts of the physical body.

It does that because our brains are wired to divide the world into in-groups and out-groups – more simply, into an “us” and a “them” – and to favor the former. This isn’t actually so illogical. After all, humans survive by forming and maintaining groups, so much of our psychology is devoted to doing just that.

As a result, we value being good members of our groups more than we value being strictly, factually correct. As long as a group continues to satisfy our needs, we’d rather be technically wrong than risk our standing with our peers. In the words of sociologist Brooke Harrington, “Social death is more frightening than physical death.”

So, now we can understand people’s reactions during the MRI experiment a bit better. Some ideas – like whether or not Thomas Edison really invented the lightbulb – aren’t part of our group identity. Thus, when those ideas are challenged, we don’t feel physically threatened. However, when it comes to political beliefs, we cease to reason as individuals and instead reason as members of a group. By stalwartly maintaining our beliefs, we seem trustworthy to our peers. And that, to us, feels like safety.

However, just because we identify ourselves with a particular group at one point in time doesn’t mean we’re bound to consider them trustworthy forever. Once we feel that it’s our own peers who have become untrustworthy or ceased to satisfy our safety needs, we’ll unconsciously attempt to change them through argumentation. If that form of searching for connection doesn’t work, we begin to search outside our original group.

This is exactly the reason why Charlie Veitch was ultimately able to change his mind and leave the truther community, and why, for example, people eventually feel safe enough to leave cults. It wasn’t the facts alone that persuaded Charlie. He had merely become open to the facts because he had found another community, called Truth Juice, which was more in line with his values than the old one. Truth Juice groups meet across the UK to listen to New Age, transhumanist, occult, spirituality, and conspiracy theorist speakers deliver lectures on a wide range of topics. Charlie had met and entered into a relationship with a woman at one of these gatherings, and he’d slowly begun to assimilate more with the Truth Juice community than with his old community of truthers. So when he participated in the Conspiracy Road Trip, he was already in a state where he felt safe and comfortable changing his mind.

Face-to-face discussions can change a mind in 20 minutes.

We humans usually feel very confident that we know and understand the reasons behind our thoughts, feelings, beliefs, motivations, and goals. We create a sort of biography for ourselves – one that tells us we’re reasonable people who reach conclusions by viewing and contemplating the evidence before us. If other people were as smart as us, we say to ourselves, they would surely reach the same conclusions.

But that’s not how it works. Instead, it’s much more like we’re the observers of our own behavior. We create rationalizations, justifications, and stories about what we think, feel, and believe after we already believe it.

Ever tried to change someone’s mind by patiently – or perhaps impatiently – unfurling a list of facts and evidence and totally failed to make them budge? Well, this is exactly why. People make their decisions on a visceral level first and then apply the logical reasoning process second.

Steve Deline and his organization, the Leadership LAB, learned this through experience. For more than a decade, Steve and his fellow volunteers have been going out almost every Saturday to speak to people at their front doors. They’ve had over 15,000 conversations to date, and most of them have been recorded for later review.

Steve and his crew call their method deep canvassing, and through it, they’re often able to get a person to give up a long-held opinion in less than 20 minutes. The method is focused on open, honest, vulnerable conversations where judgment is carefully reserved. Because the LAB is the political action arm of the Los Angeles LGBT Center, their work focuses on changing voters’ opinions on issues like same-sex marriage and combatting homophobia and transphobia.

During deep canvassing training, volunteers learn several techniques that they can use in their conversations. One example is called “modeling vulnerability” – sharing a mistake you made in the past or something that was difficult for you in order to encourage the other person to do the same.

One of the most important things deep canvassers learn is to never argue with or challenge another person’s claims. That just causes the conversation to spiral into an unwinnable fight, and that isn’t the point at all. The point is to listen to the other person and make them feel heard and respected.

The efficacy of deep canvassing is still being studied, but the results are extremely promising so far. One study, performed by political scientists David Broockman and Joshua Kalla, found that a single deep canvassing conversation caused one in ten people opposed to transgender rights to change their views. Ten percent might not sound like much to the average person. But to politicians and political scientists, that’s absolutely enormous – especially after a single conversation. According to Kalla, a mind change of much less than that could rewrite laws, win a swing state, or decide an entire election. Moreover, the effects of deep canvassing appear to last a long time. The people who changed their minds showed no signs of backtracking to their former attitudes. That’s extremely rare in political science research.

So, is it possible to make use of deep canvassing techniques in everyday situations involving friends, family, or acquaintances? That’s what we’re about to find out.

Use conversations to help people think better.

Before you get too excited about the potential of deep canvassing, we need to add a disclaimer. While it is possible to make use of some deep canvassing principles in your everyday life, people who participate in the Leadership LAB all undergo fairly extensive training before they hit the streets.

Fortunately, deep canvassing has a lot in common with other methods of changing people’s minds and helping them think better. The one we’re going to explore here is street epistemology. It’s called that because the person who developed the practice, Anthony Magnabosco, started by stopping random people on the street and getting them to question their epistemology – basically, how they know what they know or, more accurately, think they know.

Street epistemology is less goal-oriented than deep canvassing. Deep canvassing is done by people with a particular perspective, who want to convince other people to join their side of a particular argument. Street epistemology, on the other hand, is more focused on getting people to question the fact claims they believe – for example, that God is real, that the Earth is flat, or that vaccines cause autism. It investigates the reasons people have for believing these claims and whether those reasons are actually good ones. Almost any claim can be explored using the methods of street epistemology.

The way it works seems simple but can be devilishly difficult to get right in practice. There are nine steps, and we’re going to go through all of them now.

The first step is to establish rapport. Ask the other person for consent to examine their beliefs. Take some time to ask about their day, about what’s going on in their lives. Don’t be overzealous and immediately jump into the topic you want to discuss. Make sure that the other person feels heard and that you’re going to listen to what they have to say. If they feel safe, they’ll be more willing to open up.

The next step is to ask them for a claim. We’ve already mentioned a few examples. Street epistemology works best on fact-based claims, but it’s also possible to use it on more attitude-based claims – even something as mundane as “strawberry ice cream is better than vanilla ice cream.” It also works on values-based claims, like “Americans’ tax dollars should go toward forgiving student loans instead of buying more aircraft carriers.”

After the other person has given you a claim, step three is to confirm that you understood it. Don’t restate their claim in their exact words, but do reflect what they said back to them.

Next, clarify the definitions of any terms involved in what you’ll be discussing. For example, some people might use the term “the government” to talk about a collection of evil billionaires plotting to take advantage of normal citizens. You, on the other hand, might see “the government” as a collection of civil servants trying to improve the state of the nation for everybody. Identifying what each of you means by the terms you’re using avoids the problem of talking past one another. During your discussion, use their meaning for each term, not yours.

After this, step five is to identify a confidence level. Ask your conversation partner to label the confidence they have in their claim from 0 to 100. This allows them to take a step back and assess just how sure they are about their own feelings.

It’s an easy segue from here to step six, which is identifying how they arrived at their confidence level. If they say they’re at 80 percent confidence, you can ask them, “Why not 100?” They might offer several reasons, but try to settle on what might be the common factor uniting all of them.

Next, it’s time for the most important step, step seven: asking what method they used to judge the quality of their reasons. There are tons of ways of going about this, but the point is essentially to get people to assess the reliability of their ways of knowing or believing. What method did they use to arrive at their conclusions? Could someone else have plausibly used the same method and arrived at a completely different conclusion? For example, say you’re speaking with someone who believes in the theory that the Earth is flat rather than a globe. You could ask them, “What would you say is the biggest reason why you think the Earth is flat today?” and “What reason would most lower your confidence in the flat Earth theory?” Your goal here is to move the person away from the claim itself and help them see which factors are influencing or underlying their reasoning. How important is the evidence to them, and could other evidence cause them to change their mind? Remember, you’re not actually giving them any counterevidence here – just helping them explore what kinds of counterevidence would change their mind.

Once you’ve explored their judgment sufficiently, you can move on to step eight, the penultimate step: listen, summarize, and repeat. Reflect the person’s answers back to them by paraphrasing their arguments. Then thank them for their time, and encourage them to keep thinking about their thinking. Share your own beliefs, if you like, and offer to explore them in the same way you explored theirs.

Finally, for step nine, simply suggest that you two continue the conversation later.

Remember that throughout all of this, your goal is not to copy and paste your own reasoning about a particular issue onto another person. It’s simply to get them to think about their own thoughts in a way that they usually don’t. All you’re really doing is guiding them through their own reasoning. But try it, and you’ll see how remarkable the effects can be – and how many minds end up changed as a result.


Changing a person’s mind can seem like an insurmountable task. After all, the human brain is highly motivated to stick to its existing beliefs; that way, it continues to appear trustworthy to members of its chosen in-group, which is key to its survival. However, it is possible to change someone’s mind by having an empathetic, face-to-face conversation. The key is to uncover the real reasons why someone believes what they believe rather than try to persuade them with a barrage of facts and evidence.

But before you embark on a mission to change someone’s mind, you should first ask yourself, “Why?” Why is changing that person’s mind important to you? After determining your answer, share it with that person. This will help you establish a mutual understanding of your interests. Often, you’ll find yourself clashing with the person on the specific positions each of you takes, but you’ll have the same underlying interest. This way, the conversation becomes a collaboration rather than a conflict.



About the author

David McRaney is an author, journalist, lecturer, and the creator of the blog You Are Not So Smart, which became an internationally bestselling book, later followed by You Are Now Less Dumb. David currently hosts the popular You Are Not So Smart podcast and speaks internationally about irrational thinking and delusion. Before finding internet fame, David graduated with a degree in journalism from the University of Southern Mississippi and cut his teeth covering Hurricane Katrina on the Gulf Coast and in the Pine Belt region of the Deep South. He has been a beat reporter, editor, photographer, voice-over artist, television host, digital content manager, and everything in between.
