Book Summary: Thinking 101 – How to Reason Better to Live Better

Thinking 101 (2022) asserts that by understanding and overcoming thinking biases, we can better solve or even avoid most problems, from everyday conflicts to larger societal issues.

Content Summary

Genres
Introduction: Improve your life – and society – by overcoming common biases in thinking.
Our minds overestimate our abilities to do things that seem easy.
We tend to go with what we think we know without considering all possibilities.
We prefer examples and stories at the expense of more rational, statistical data.
We care too much about negative facts and fear losing ownership.
We shape new facts to fit what we already know.
We rarely understand perspectives outside of our own.
We prefer to settle for less now instead of waiting for more later.
Summary
About the author
Table of Contents
Overview
Review/Endorsements/Praise/Award

Genres

Personal Development, Psychology, Self Help, Science, Philosophy, How To, Neuroscience, Brain, Sociology, Management, Leadership, Business Decision Making, Decision-Making and Problem Solving, Cognitive Psychology

Introduction: Improve your life – and society – by overcoming common biases in thinking.

One of Yale University’s most popular undergraduate courses is simply titled Thinking. The course explores common thinking pitfalls and, better yet, how people can overcome them to make better-informed decisions to improve their own lives and society as a whole.

You don’t have to attend Yale to unlock this knowledge. Professor Woo-kyoung Ahn, who developed and teaches the course, makes the top takeaways available to everyone in Thinking 101.

This summary explains common types of thinking biases and how they can affect our decision-making, including evidence for how they’re often at the root of problems ranging from poor financial decisions to societal prejudice. You’ll also discover solutions for overcoming each of them.

Our minds overestimate our abilities to do things that seem easy.

Have you ever watched a YouTube video for a recipe, makeup tutorial, or home repair that seemed simple until you tried it and had to file it away as a failure? Woo-kyoung Ahn runs a similar experiment with her students: They watch a six-second dance routine eleven times plus a slower instructional video. Then, they can volunteer to do the dance with the promise of prizes for doing it successfully. There’s no shortage of volunteer dancers, yet no one nails it.

Why? Because fluency, meaning how easily our brains process new information, can fuel overconfidence that skews our decisions and outcomes.

While fluency informs metacognition – which is the critical process by which we judge situations to determine the next steps – we can’t rely on it entirely to ensure good outcomes.

Thankfully, there’s a pretty simple way to overcome the fluency effect: practice new things, like rehearsing a speech or interview responses. Of course, there are situations when you don’t get a trial run, like tackling a home renovation project. In those cases you can plan, but be aware that studies show people tend to be overconfident and overly optimistic when planning, too. To counter that, add padding to your initial estimate of what it will take to accomplish your goal, whether that’s time, money, effort, or a combination. The author recommends adding 50 percent to your initial estimate. So, for example, if you think you can meet a deadline in two days, tell your boss to expect it in three.
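The padding rule is simple arithmetic; here’s a minimal sketch (the function name is ours, and the 50 percent default just mirrors the book’s suggestion):

```python
def padded_estimate(initial, buffer=0.5):
    """Pad an initial estimate to counter planning-fallacy optimism.

    The default buffer of 0.5 reflects the book's suggested 50 percent.
    """
    return initial * (1 + buffer)

# A two-day estimate becomes a three-day promise:
print(padded_estimate(2))  # 3.0
```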

We tend to go with what we think we know without considering all possibilities.

Imagine you’re given three numbers in a sequence and told that the sequence follows a rule that you must determine. You then have to give another set of three numbers that follow what you believe that rule to be.

All clear? Good. Then let’s try this one out.

The numbers you’re given are 2-4-6.

What’s your series of numbers, and why?

When Ahn poses this question to her students, many give the same answers as participants in the famous experiment conducted in 1960 by cognitive psychologist Peter Wason. They say “4-6-8,” which indeed follows the rule. Then, they assert that the rule is “even numbers increasing by two.”

But it’s not.

Once they know this, guesses become more complicated and the students become more exasperated until finally, they figure out the rule guiding the series is simply “any increasing numbers.”

This example illustrates the concept of confirmation bias and how it can block us from finding solutions, even when simpler ones exist.
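To see why confirming tests alone can’t reveal the rule, here’s a short sketch in Python (the function names and rule encodings are our own):

```python
def actual_rule(a, b, c):
    """Wason's real rule: any strictly increasing numbers."""
    return a < b < c

def hypothesized_rule(a, b, c):
    """The rule most people guess from 2-4-6: even numbers increasing by two."""
    return a % 2 == 0 and b == a + 2 and c == b + 2

# A positive test like 4-6-8 satisfies both rules, so it can't
# distinguish between them:
print(actual_rule(4, 6, 8), hypothesized_rule(4, 6, 8))  # True True

# Only a triple that breaks the hypothesis is diagnostic. 1-2-3
# violates "even numbers increasing by two" yet still fits the rule:
print(actual_rule(1, 2, 3), hypothesized_rule(1, 2, 3))  # True False
```

The takeaway: a test that can only confirm your hypothesis teaches you nothing about the alternatives.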

So how can you overcome confirmation bias? You could, for example, come up with two mutually exclusive hypotheses and work equally to confirm both. There also are small ways to try it in your day-to-day life, like taking a different route to work, ordering a dish you’ve never tried next time you order takeout, or letting a friend pick out something for you while shopping. You may end up with a more pleasant commute, a new favorite dish, or maybe a sweater you never wear, but your mind will be more open than before.

We prefer examples and stories at the expense of more rational, statistical data.

Imagine if you’d enrolled your young child in ice-skating lessons and then saw little to no skill improvement over three years. Around the same time you started the skating classes, you also let them give soccer a go. Then at a match, you noticed your child was actively running away from the ball when it was kicked toward them.

You’d probably think, “My child is just not into sports.” The author certainly did when she experienced these situations with her son. Yet in high school, he discovered cross-country running and became captain of the team.

It wasn’t that he disliked all sports, just the two his mother chose when he was young. Ahn explains that she came to a faulty conclusion based on just two examples when, in reality, there’s a world of different sports out there. She uses the story – and yes, it is anecdotal – to illustrate the law of large numbers and how it can help better inform our decisions.

Researchers have long asserted that storytelling through anecdotes and examples is powerful because it appeals to our senses and is therefore more relatable than abstract concepts. For example, when the Centers for Disease Control and Prevention rolled out a campaign featuring testimonials by former smokers who’d experienced life-altering conditions, data soon followed showing a 12 percent increase in people attempting to quit. These types of campaigns are consistently shown to be more effective than abstract approaches used in the past, such as warning labels on tobacco product packaging.

The challenge comes when data overwhelmingly points to conclusions that conflict with specific examples. In such cases, we need to override our pull toward vivid stories and look to statistics to make rational decisions.

To balance your natural reactivity to anecdotes, you need to become more comfortable with statistics. Ahn explains that people aren’t comfortable with statistics for many reasons, first because we rarely work day to day with the kinds of large samples and demographics that studies draw on. Even the concept of using probabilities in reasoning is relatively new to humanity – it wasn’t documented until the 1560s. And while most statistical concepts are learnable, they’re just not easy to mentally summon in day-to-day situations.

But some statistical concepts are simpler than you may think, like the law of large numbers we referenced earlier. It simply means that the more observations you have, the closer your picture gets to reality. If you consider all the data available to you instead of relying on a single story, you’re more likely to arrive at a logical conclusion.
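A quick simulation makes the law of large numbers concrete. A single die roll, like a single anecdote, can land anywhere, but larger samples settle near the true mean of 3.5 (the dice example is ours, not the book’s):

```python
import random

random.seed(42)  # fixed seed so the runs are reproducible

def sample_mean(n):
    """Mean of n rolls of a fair six-sided die; the true mean is 3.5."""
    return sum(random.randint(1, 6) for _ in range(n)) / n

for n in (1, 10, 100, 100_000):
    print(f"{n:>7} rolls -> mean {sample_mean(n):.2f}")
```

With one roll the "mean" can be anywhere from 1 to 6; with a hundred thousand it hugs 3.5.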

We care too much about negative facts and fear losing ownership.

Many studies support the idea that people give more mental real estate to negative events than positive ones. Others show how our fear of losing something can keep us from even considering possible gains, at least up to a point. Even the emotional attachment of ownership clouds our judgment in what is known as the endowment effect.

In one study, researchers gave a group of people a choice between a mug and a chocolate bar as a gift, and the group split about fifty-fifty in their choices. A second group each received a mug as a gift to keep. They then had the option of swapping the mug for a chocolate bar – only 11 percent took that offer. You might wonder if there was something special about the mug, but when a third group received the chocolate first, only 10 percent wanted to exchange it for the mug. In the latter two groups, an overwhelming majority didn’t want to part with something simply because they considered it theirs.

This and other forms of negativity bias can keep us from making the best choices in the first place. On top of that, we have difficulty letting go of things that aren’t serving us. Thankfully, there are ways to make this duality work to our benefit. We can positively reframe the options before us, like considering the 90 percent chance of surviving a necessary surgery or choosing a flight with an 88 percent on-time arrival rate. On the flip side, we should be equally alert when a salesperson frames a sale as removing options from a loaded package rather than adding them to a baseline, since losses loom larger than gains. As for perceived ownership, be watchful about free trials of a service: don’t let the feeling that it’s already yours sway your decision to keep or cancel. Ask instead whether you’ll use it enough to justify the cost.

We shape new facts to fit what we already know.

Biased interpretation is the root of the confirmation bias we discussed previously. It’s when we rely on our existing beliefs so much that we even take new, conflicting data and shape it to fit the story we’re telling ourselves instead of being objective.

For example, when the author was pregnant with her first child, she happened to read an article in Nature about how babies exposed to any light while sleeping were found to be five times more likely to develop nearsightedness. With that, she crossed night-light off her nursery shopping list.

A year later, Nature reported that the study had come to faulty conclusions. Researchers had failed to consider that the parents of the babies who developed nearsightedness had the condition themselves. Because of that, the parents were more likely to have night-lights in their children’s nurseries. Also, the condition of nearsightedness can be passed on genetically, which was more likely the cause of the babies becoming nearsighted than exposure to night-lights as they slept.

Even in light of this new information, by the time she had her second child, Ahn still refused to use a night-light. Nearsighted herself, she couldn’t shake the first study’s erroneous link between night-lights and infant nearsightedness, even though the follow-up study had all but debunked it.

Overcoming biased interpretation is tougher than overcoming most other cognitive biases because it’s part of our top-down processing – the subconscious mental framework we use to take in new information. Cognitive behavioral therapy has been shown to be effective, though it is demanding. Broader countermeasures include simply being aware of biased interpretation and of how it can cause significant problems at a societal level, such as entrenching long-held prejudices against people who differ from us. That kind of awareness can drive changes to systemic policies and regulations for the betterment of society.

We rarely understand perspectives outside of our own.

The research on how poorly people pick up nuances in tone may frighten you. In one study, friends paired off, with each writing a series of single-sentence emails which they then sent to one another. Some of the sentences were sarcastic. Others were serious. The recipients then had to determine which were which. The results? Their perceptions were accurate only half the time. While these sets of friends perceived the messages accurately when delivered verbally, other studies on ambiguity in verbal communication show significant confusion, even among people who knew one another well.

We have misunderstandings all the time, even with the best intentions and with people we know well. That’s because, despite a desire to see others’ perspectives, research shows we’re really bad at it. When we assume too much familiarity in our communications, things quickly go awry.

While considering and caring about others’ perspectives is a critical first step to understanding, the only surefire ways to get it right are as simple as they sound. Be clear about your own thoughts, even if it means adding an emoji to a text or overstating yourself verbally. With others, don’t try to mind read, guess, or assume, no matter how well you think you know them. Instead, just ask.

We prefer to settle for less now instead of waiting for more later.

If someone offered you $340 now or $350 in six months, which would you choose? If you’re like most people, you’d go for the $340 now. What if we up that offer to $390 in six months versus $340 now? Even then, most people would take the immediate reward instead of the promise of $50 more later.

These are typical tests and results showing how easily we dismiss delayed gratification, even when waiting is the rational choice. Take the second question, where many people justify taking the lower amount now by reasoning that they could invest the money and make more than the $50 difference, or that some major event in the next six months could prevent them from collecting. But no market investment in a normal economy returns that much over six months, and such derailing events are unlikely. Yet research shows we still cling to that reasoning, at least when the differences are relatively small.
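The “I could invest it instead” justification falls apart under simple arithmetic. Taking the $390 later is equivalent to earning a guaranteed return on the $340 (the annualization, assuming the same return compounds over two half-years, is our own illustration):

```python
now, later = 340, 390

# Return implied by waiting six months:
six_month_return = later / now - 1   # ~0.147, i.e. 14.7%

# Annualized, assuming that return compounds over two half-years:
annualized = (later / now) ** 2 - 1  # ~0.316, i.e. 31.6%

print(f"{six_month_return:.1%} over six months, {annualized:.1%} annualized")
```

No ordinary investment reliably delivers a guaranteed 31.6 percent a year, which is why waiting is the rational choice here.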

We struggle with delayed gratification for three reasons, and to overcome it we need to take each of them on individually.

One is simply our lack of self-control. Studies show that one of the most effective ways to resist temptation is to find a useful distraction.

Another is our difficulty sorting through uncertainty: we often delay a decision until some unknown is resolved. Be sure, in those instances, that the decision truly depends on the outcome. For example, you may want a vacation for different reasons depending on whether you pass or fail a test – but if you’re taking the vacation either way, take advantage of planning ahead.

Last, since it’s difficult to feel future experiences, there’s a disconnect in the present when making decisions for our future selves. To address that, set goals, remind yourself of them often, and imagine how they’ll impact your life for the better.

Summary

You need to be aware of your thinking biases and employ simple techniques to counter them. As a consequence, you’ll make better decisions that improve not only your own life but also society as a whole. Use the strategies outlined in this summary to be fairer to yourself, and to others, by fostering understanding and cooperation.

About the author

WOO-KYOUNG AHN is the John Hay Whitney Professor of Psychology at Yale University. After receiving her Ph.D. in psychology from the University of Illinois, Urbana-Champaign, she was assistant professor at Yale University and associate professor at Vanderbilt University. In 2022, she received Yale’s Lex Hixon Prize for teaching excellence in the social sciences. Her research on thinking biases has been funded by the National Institutes of Health, and she is a fellow of the American Psychological Association and the Association for Psychological Science. Thinking 101 is her first book.

Woo-kyoung Ahn | Website

Table of Contents

Introduction 1
1 The Allure of Fluency: Why Things Look So Easy 7
2 Confirmation Bias: How We Can Go Wrong When Trying to Be Right 37
3 The Challenge of Causal Attribution: Why We Shouldn’t Be So Sure When We Give Credit or Assign Blame 75
4 The Perils of Examples: What We Miss When We Rely on Anecdotes 105
5 Negativity Bias: How Our Fear of Loss Can Lead Us Astray 137
6 Biased Interpretation: Why We Fail to See Things As They Are 163
7 The Dangers of Perspective-Taking: Why Others Don’t Always Get What’s Obvious to Us 193
8 The Trouble with Delayed Gratification: How Our Present Self Misunderstands Our Future Self 223
Epilogue 253
Acknowledgments 257
Notes 261
Index 271

Overview

Psychologist Woo-kyoung Ahn devised a course at Yale called “Thinking” to help students examine the biases that cause so many problems in their daily lives. It quickly became one of the university’s most popular courses. In Ahn’s class, students examine “thinking problems”―like confirmation bias, causal attribution, and delayed gratification―and how they contribute to our most pressing societal issues and inequities. Now, for the first time, Ahn presents key insights from her years of teaching and research in a book for everyone.

Ahn draws on decades of research from other cognitive psychologists, as well as from her own groundbreaking studies. And she presents it all in a compellingly readable style that uses fun examples from K-pop dancing, anecdotes from her own life, and illuminating stories from history and the headlines.

Thinking 101 goes far beyond other books on thinking, showing how better awareness of our biases can improve not just our own daily lives but also the lives of everyone around us, and help us tackle real-world problems. It is, quite simply, required reading for everyone who wants to think―and live―better.

Review/Endorsements/Praise/Award

“An INVALUABLE RESOURCE to anyone who wants to think better.” ―Gretchen Rubin

Award-winning YALE PROFESSOR Woo-kyoung Ahn delivers “A MUST-READ―a smart and compellingly readable guide to cutting-edge research into how people think.” (Paul Bloom)

“A FUN exploration.” ―Dax Shepard

“This book is not just a lucid overview of the cognitive traps that wreak havoc on your reasoning―it’s also an expert’s guide to rethinking how we think.” ―Adam Grant, #1 New York Times bestselling author of Think Again

“Thinking 101 combines the best science with practical advice to help you make better decisions. Ahn’s stories are spot-on, they are humorous, and they show us how thinking can be turned on itself to overcome the biases from, well, thinking!” ―Mahzarin Banaji, Professor of Psychology, Harvard University and co-author of Blindspot: Hidden Biases of Good People

“Every day of our lives, we make judgments―and we don’t always do a very good job of it. Thinking 101 is an invaluable resource to anyone who wants to think better. In remarkably clear language, and with engaging and often funny examples, Woo-Kyoung Ahn uses cutting-edge research to explain the mistakes we often make―and how to avoid them.” ―Gretchen Rubin, #1 New York Times bestselling author of The Happiness Project and The Four Tendencies

“Thinking 101 delivers a world-class tune-up for your brain. It will unclog your mental gears, restart your cognitive engine, and put you on the road to making smarter decisions.” ―Daniel H. Pink, #1 New York Times bestselling author of The Power of Regret, Drive, and A Whole New Mind

“There are other books on typical errors and biases of thinking. But Ahn’s is remarkable. Not only does she limit her coverage to just eight major such thinking problems, which allows her to deeply inform the reader about each with engaging, conversational prose, she also offers compelling, research-based ways to limit the problems’ unwanted impact. The result is a terrific one-two punch.”
―Robert Cialdini, author of Influence and Pre-Suasion

“Thinking 101 is a must-read―a smart and compellingly readable guide to cutting-edge research into how people think. Building from her popular Yale course, Professor Woo-kyoung Ahn shows how a better understanding of how our minds work can help us become smarter and wiser―and even kinder.” ―Paul Bloom, Professor of Psychology, University of Toronto, Brooks and Suzanne Professor Emeritus of Psychology at Yale University, and the author of The Sweet Spot

“Ahn’s book is an absorbing, timely― and I think essential ― guide to how our minds go wrong and what we can do to think better. With lots of humorous stories and cautionary thinking tales, this terrifically-written book is a must-read for anyone who wants to understand and overcome the powerful yet invisible thinking traps that lead us astray.” ―Laurie Santos, Professor of Psychology at Yale University and host of The Happiness Lab podcast

“Woo-kyoung Ahn uses wonderfully engaging examples to show how we can understand and improve our reasoning.”
―Anna Rosling Rönnlund, co-author of Factfulness

“Thinking 101 breaks down when human thinking breaks down and unlike many other books on the topic, this one is accessible, engaging, and fun to read. Woo-kyoung Ahn’s delightful sense of humor shines through, as she uses entertaining stories and examples to compellingly illustrate why thinking errors happen, why it matters, and what to do about it. The book is full of research-backed insights into how the mind works that newcomers to the field will find clear and understandable, but also has a number of gems that more advanced readers will appreciate.” ―Danny Oppenheimer, Professor at Carnegie Mellon University and author of Democracy Despite Itself

“Thinking 101 provides evidence-based advice that has real potential to improve lives.” ―Science

“Ahn excels at illustrating how psychological concepts manifest in everyday life, and her suggestions provide sensible techniques readers can use to push back against cognitive biases. This heady volume provides plenty of food for thought.” ―Publishers Weekly
