Summary: Thinking 101: How to Reason Better to Live Better by Woo-kyoung Ahn
- The book is a guide to reasoning that teaches readers how to improve their thinking skills and avoid common biases and fallacies, based on the latest research in cognitive psychology.
- The book covers eight common thinking problems – the allure of fluency, confirmation bias, causal attribution, the perils of examples, negativity bias, biased interpretation, the dangers of perspective-taking, and the trouble with delayed gratification – along with practical strategies for overcoming each.
- The book shows how better reasoning can help us live better by achieving our goals, overcoming challenges, coping with emotions, and making better decisions. It also emphasizes the importance of being aware of our biases, seeking evidence, considering alternatives, and revising our beliefs.
Thinking 101 (2022) asserts that by understanding and overcoming thinking biases, we can better solve or even avoid most problems, from everyday conflicts to larger societal issues.
Table of Contents
- Introduction: Improve your life – and society – by overcoming common biases in thinking.
- Our minds overestimate our abilities to do things that seem easy.
- We tend to go with what we think we know without considering all possibilities.
- We prefer examples and stories at the expense of more rational, statistical data.
- We care too much about negative facts and fear losing ownership.
- We shape new facts to fit what we already know.
- We rarely understand perspectives outside of our own.
- We prefer to settle for less now instead of waiting for more later.
- Summary
- About the author
- Genres
- Table of Contents
- Review
Introduction: Improve your life – and society – by overcoming common biases in thinking.
One of Yale University’s most popular undergraduate courses is simply titled Thinking. The course explores common thinking pitfalls and, better yet, how people can overcome them to make better-informed decisions to improve their own lives and society as a whole.
You don’t have to attend Yale to unlock this knowledge. Professor Woo-kyoung Ahn, who developed and teaches the course, makes the top takeaways available to everyone in Thinking 101.
This summary explains common types of thinking biases and how they can affect our decision-making, including evidence for how they’re often at the root of problems ranging from poor financial decisions to societal prejudice. You’ll also discover solutions for overcoming each of them.
Our minds overestimate our abilities to do things that seem easy.
Have you ever watched a YouTube video for a recipe, makeup tutorial, or home repair that seemed simple until you tried it and had to file it away as a failure? Woo-kyoung Ahn runs a similar experiment with her students: They watch a six-second dance routine eleven times plus a slower instructional video. Then, they can volunteer to do the dance with the promise of prizes for doing it successfully. There’s no shortage of volunteer dancers, yet no one nails it.
Why? Because fluency, meaning how easily our brains process new information, can fuel overconfidence, which in turn skews our decisions and their outcomes.
While fluency informs metacognition – which is the critical process by which we judge situations to determine the next steps – we can’t rely on it entirely to ensure good outcomes.
Thankfully, there’s a pretty simple way to overcome the fluency effect: you can practice new things, like rehearsing a speech or interview responses. Of course, there are situations when you don’t get a trial run, like tackling a home renovation project. In those cases, you can plan, but you should be aware that studies show people also tend to be overly confident and optimistic about planning. To counter that, add padding to your initial estimate of what it will take to accomplish your goal, whether that’s time, money, effort, or a combination. The author recommends adding 50 percent to your initial estimate. So for example, if you think you can meet a deadline in two days, tell your boss to expect it in three.
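For readers who think in code, here’s a minimal Python sketch of that padding rule. It isn’t from the book – the function name and figures are just for illustration:

```python
def padded_estimate(initial, buffer=0.5):
    """Pad an initial estimate by a fixed buffer (50 percent by default)
    to counter our tendency toward overly optimistic plans."""
    return initial * (1 + buffer)

# A task you believe will take 2 days: promise it in 3.
print(padded_estimate(2))        # 3.0
# The same rule applies to money or effort, e.g. a $10,000 renovation budget.
print(padded_estimate(10_000))   # 15000.0
```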
We tend to go with what we think we know without considering all possibilities.
Imagine you’re given three numbers in a sequence and told that the sequence follows a rule that you must determine. You then have to give another set of three numbers that follow what you believe that rule to be.
All clear? Good. Then let’s try this one out.
The numbers you’re given are 2-4-6.
What’s your series of numbers, and why?
When Ahn poses this question to her students, many give the same answers as participants in the famous experiment conducted in 1960 by cognitive psychologist Peter Wason. They say “4-6-8,” which indeed follows the rule. Then, they assert that the rule is “even numbers increasing by two.”
But it’s not.
Once they know this, guesses become more complicated and the students become more exasperated until finally, they figure out the rule guiding the series is simply “any increasing numbers.”
This example illustrates the concept of confirmation bias and how it can block our ability to solve problems – sometimes blinding us to simpler solutions.
So how can you overcome confirmation bias? You could, for example, come up with two mutually exclusive hypotheses and work equally to confirm both. There also are small ways to try it in your day-to-day life, like taking a different route to work, ordering a dish you’ve never tried next time you order takeout, or letting a friend pick out something for you while shopping. You may end up with a more pleasant commute, a new favorite dish, or maybe a sweater you never wear, but your mind will be more open than before.
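To see why the 2-4-6 task fools people, here’s a minimal Python sketch (our illustration, not the author’s): triples chosen to confirm the “even numbers increasing by two” guess can never disconfirm it, so only a test you expect to fail can reveal the real rule.

```python
def true_rule(a, b, c):
    """The actual rule in Wason's task: any increasing numbers."""
    return a < b < c

def popular_guess(a, b, c):
    """The rule most people hypothesize: even numbers increasing by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# A positive test of the guess passes under both rules, so it teaches nothing:
print(true_rule(4, 6, 8), popular_guess(4, 6, 8))   # True True
# Only a triple that should fail the guess can separate the two hypotheses:
print(true_rule(1, 2, 3), popular_guess(1, 2, 3))   # True False
print(true_rule(6, 4, 2), popular_guess(6, 4, 2))   # False False
```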
We prefer examples and stories at the expense of more rational, statistical data.
Imagine if you’d enrolled your young child in ice-skating lessons and then saw little to no skill improvement over three years. Around the same time you started the skating classes, you also let them give soccer a go. Then at a match, you noticed your child was actively running away from the ball when it was kicked toward them.
You’d probably think, “My child is just not into sports.” The author certainly did when she experienced these situations with her son. Yet in high school, he discovered cross-country running and became captain of the team.
It wasn’t that he disliked all sports, just the two his mother chose when he was young. Ahn explains that she came to a faulty conclusion based on just two examples when, in reality, there’s a world of different sports out there. She uses the story – and yes, it is anecdotal – to illustrate the law of large numbers and how it can better inform our decisions.
Researchers have long asserted that storytelling through anecdotes and examples is powerful because it appeals to our senses and is therefore more relatable than abstract concepts. For example, when the Centers for Disease Control and Prevention rolled out a campaign featuring testimonials by former smokers who’d experienced life-altering conditions, data soon followed showing a 12 percent increase in people attempting to quit. These types of campaigns are consistently shown to be more effective than abstract approaches used in the past, such as warning labels on tobacco product packaging.
The challenge comes when the data overwhelmingly points to conclusions that conflict with a vivid example. In such cases, we need to override our visceral preference for stories and look to the statistics to make rational decisions.
To balance your natural reactivity to anecdotes, you need to become more comfortable with data science. Ahn explains that people are uncomfortable with statistics for several reasons. For one, we rarely work with the large samples and demographic breakdowns that studies are built on. Even the concept of using probabilities in reasoning is relatively new to humanity – it wasn’t documented until the 1560s. And while most statistical concepts are learnable, they’re not easy to summon mentally in day-to-day situations.
But some data science concepts are simpler than you may think, like the law of large numbers we referenced earlier. It simply means that the more data an estimate is based on, the closer that estimate tends to be to the truth. If you weigh all the data available to you instead of relying solely on a single story, you’re more likely to arrive at a sound conclusion.
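Here’s a minimal simulation of the law of large numbers in Python (our sketch, not from the book), using coin flips: tiny samples swing wildly, while large ones settle near the true 50 percent.

```python
import random

def heads_rate(num_flips):
    """Estimate the probability of heads from a sample of fair coin flips."""
    return sum(random.random() < 0.5 for _ in range(num_flips)) / num_flips

random.seed(42)  # make the run reproducible
for n in (2, 10, 100, 10_000):
    # Judging from two data points (like two sports) can land far from
    # the truth; larger samples converge toward the real 0.5.
    print(f"{n:>6} flips: {heads_rate(n):.3f}")
```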
We care too much about negative facts and fear losing ownership.
Many studies support the idea that people give more mental real estate to negative events than positive ones. Others show how our fear of losing something can keep us from even considering possible gains, at least up to a point. Even the emotional attachment of ownership clouds our judgment in what is known as the endowment effect.
In one study, researchers gave a group of people a choice between a mug and a chocolate bar as a gift, and the group split roughly fifty-fifty. A second group each received a mug as a gift to keep, then had the option of swapping it for a chocolate bar – only 11 percent took that offer. You might wonder if there was something special about the mug, but when a third group received chocolate first, only 10 percent wanted to exchange it for the mug. In the latter two groups, overwhelming majorities didn’t want to part with something simply because they considered it theirs.
This and other forms of negativity bias can steer us toward worse choices from the start. On top of that, we have difficulty letting go of things that no longer serve us. Thankfully, there are ways to counter both tendencies. We can reframe the options before us positively, like focusing on the 90 percent chance of surviving a necessary surgery or choosing a flight with an 88 percent on-time arrival rate. On the flip side, we should be equally skeptical when a salesperson frames a deal by removing options from a loaded package rather than adding them to a baseline. As for perceived ownership, be watchful with free trials of a service: don’t let the feeling that it’s already yours sway your decision to keep or cancel. Ask instead whether you’ll use it enough to justify the cost.
We shape new facts to fit what we already know.
Biased interpretation is the root of the confirmation bias we discussed previously. It’s when we rely on our existing beliefs so much that we even take new, conflicting data and shape it to fit the story we’re telling ourselves instead of being objective.
For example, when the author was pregnant with her first child, she happened to read an article in Nature reporting that babies exposed to any light while sleeping were five times more likely to develop nearsightedness. With that, she crossed the night-light off her nursery shopping list.
A year later, Nature reported that the study had come to faulty conclusions. Researchers had failed to consider that the parents of the babies who developed nearsightedness had the condition themselves. Because of that, the parents were more likely to have night-lights in their children’s nurseries. Also, the condition of nearsightedness can be passed on genetically, which was more likely the cause of the babies becoming nearsighted than exposure to night-lights as they slept.
Even in light of this new information, by the time she had her second child, Ahn still refused to use a night-light. Nearsighted herself, she couldn’t shake the first study’s erroneous link between night-lights and infant nearsightedness, even though the follow-up study had all but debunked it.
Overcoming biased interpretation is tougher than overcoming most other cognitive biases because it’s baked into our top-down processing – the subconscious mental framework we use to take in new information. Cognitive behavioral therapy has been shown to help, though it’s demanding. A broader remedy is simply being aware of biased interpretation and the damage it can do at a societal level, such as sustaining long-held prejudices against people who differ from us. That awareness can drive changes to systemic policies and regulations for the betterment of society.
We rarely understand perspectives outside of our own.
The research on how poorly people pick up nuances in tone may frighten you. In one study, friends paired off, with each writing a series of single-sentence emails which they then sent to one another. Some of the sentences were sarcastic. Others were serious. The recipients then had to determine which were which. The results? Their perceptions were accurate only half the time. While these sets of friends perceived the messages accurately when delivered verbally, other studies on ambiguity in verbal communication show significant confusion, even among people who knew one another well.
We have misunderstandings all the time, even with the best intentions and with people we know well. That’s because, despite a desire to see others’ perspectives, research shows we’re really bad at it. When we assume too much familiarity in our communications, things quickly go awry.
While considering and caring about others’ perspectives is a critical first step to understanding, the only surefire ways to get it right are as simple as they sound. Be clear about your own thoughts, even if it means adding an emoji to a text or spelling things out more explicitly than feels natural. With others, don’t try to mind-read, guess, or assume, no matter how well you think you know them. Instead, just ask.
We prefer to settle for less now instead of waiting for more later.
If someone offered you $340 now or $350 in six months, which would you choose? If you’re like most people, you’ll go for the $340 now. What if we up that offer to $390 in six months versus $340 now? Even then, most people would go for the immediate reward instead of the promise of $50 more later.
These are typical tests – and typical results – showing how easily we dismiss delayed gratification, even when waiting is the rational choice. Take the second question: many people justify taking the lower amount now by reasoning that they could invest the money and earn more than the extra $50, or that some major event in the next six months could prevent them from collecting. But in a normal economy, no market investment returns more over six months than the promised increase, and catastrophic intervening events are rare. Yet research shows we still cling to that reasoning, at least when the differences are relatively small.
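As a quick check on that claim, here’s the arithmetic in a short Python sketch. The dollar figures come from the example above; the annualization is our own back-of-the-envelope step, not the book’s.

```python
now, later = 340, 390                        # take $340 now vs. $390 in six months
six_month_gain = later / now - 1             # return earned just by waiting
annualized = (1 + six_month_gain) ** 2 - 1   # compounded over two half-years

print(f"Waiting pays {six_month_gain:.1%} over six months")  # ~14.7%
print(f"That is roughly {annualized:.1%} annualized")        # ~31.6%
# Few ordinary investments reliably return over 30 percent a year, so in a
# normal economy the rational move is to wait for the $390.
```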
We struggle with delayed gratification for three reasons, and overcoming it means considering each of them and tackling them individually.
One is simply our lack of self-control. Studies show that one of the most effective ways to resist temptation is to find a useful distraction.
Another is our difficulty handling uncertainty: we often delay one decision until some unknown is resolved. In those cases, make sure the decision truly depends on the unknown outcome. For example, you might take a vacation for different reasons depending on whether you pass or fail a test, but if you’re going to take the vacation regardless, there’s nothing to wait for – plan ahead and enjoy the benefits.
Last, since it’s difficult to feel future experiences, there’s a disconnect in the present when making decisions for our future selves. To address that, set goals, remind yourself of them often, and imagine how they’ll impact your life for the better.
Summary
You need to be aware of your thinking biases and employ simple techniques to counter them. As a consequence, you’ll make better decisions, leading to improvements not only in your own life but in society as a whole. Use the strategies outlined in this summary to be fairer to yourself and to others, fostering understanding and cooperation.
About the author
WOO-KYOUNG AHN is the John Hay Whitney Professor of Psychology at Yale University. After receiving her Ph.D. in psychology from the University of Illinois Urbana-Champaign, she was an assistant professor at Yale University and an associate professor at Vanderbilt University. In 2022, she received Yale’s Lex Hixon Prize for teaching excellence in the social sciences. Her research on thinking biases has been funded by the National Institutes of Health, and she is a fellow of the American Psychological Association and the Association for Psychological Science. Thinking 101 is her first book.
Genres
Personal Development, Psychology, Self Help, Science, Philosophy, How To, Neuroscience, Brain, Sociology, Management, Leadership, Business Decision Making, Decision-Making and Problem Solving, Cognitive Psychology
Table of Contents
Introduction 1
1 The Allure of Fluency: Why Things Look So Easy 7
2 Confirmation Bias: How We Can Go Wrong When Trying to Be Right 37
3 The Challenge of Causal Attribution: Why We Shouldn’t Be So Sure When We Give Credit or Assign Blame 75
4 The Perils of Examples: What We Miss When We Rely on Anecdotes 105
5 Negativity Bias: How Our Fear of Loss Can Lead Us Astray 137
6 Biased Interpretation: Why We Fail to See Things As They Are 163
7 The Dangers of Perspective-Taking: Why Others Don’t Always Get What’s Obvious to Us 193
8 The Trouble with Delayed Gratification: How Our Present Self Misunderstands Our Future Self 223
Epilogue 253
Acknowledgments 257
Notes 261
Index 271
Review
The book is a guide to reasoning that aims to help readers improve their thinking skills and avoid common biases and fallacies. The author, Woo-kyoung Ahn, is a professor of psychology at Yale University who teaches a popular course called “Thinking”. She draws on her own research and the findings of other cognitive psychologists to explain how we think, why we make mistakes, and how we can overcome them.
The book is organized into eight chapters, each devoted to a common thinking problem: the allure of fluency, confirmation bias, causal attribution, the perils of examples, negativity bias, biased interpretation, the dangers of perspective-taking, and the trouble with delayed gratification. Each chapter pairs the problem with practical strategies for overcoming it.
Each chapter begins with a question or a scenario that illustrates a thinking problem, followed by an explanation of the relevant concepts and research. The author uses examples from pop culture, history, current events, and her own life to make the book engaging and relatable. She also provides exercises and quizzes for readers to test their understanding and apply their skills.
I enjoyed reading this book and found it very informative and useful. The author has a clear and concise writing style that makes complex ideas easy to grasp. She also has a good sense of humor that adds some fun to the book. I liked how she used real-world examples and stories to illustrate her points and show how thinking problems affect our lives. I learned a lot about how I think and how I can improve my reasoning.
The book is not only a guide to reasoning, but also a guide to living better. The author shows how thinking problems can lead to personal and social issues, such as depression, anxiety, prejudice, conflict, and injustice. She also shows how better reasoning can help us achieve our goals, overcome challenges, cope with emotions, and make better decisions. She emphasizes the importance of being aware of our biases, seeking evidence, considering alternatives, and revising our beliefs.
The book is suitable for anyone who wants to think better and live better. It is not only for students or academics, but also for professionals, parents, teachers, leaders, and citizens. It is a valuable resource for anyone who wants to learn more about themselves and the world around them.
I highly recommend this book to anyone who is interested in reasoning, psychology, or self-improvement. It is a must-read for anyone who wants to be more rational, critical, and creative. It is one of the best books on thinking that I have ever read.