Discover How to Master Practical Reasoning in Everyday Life with Decisions about Decisions by Cass R. Sunstein

Dive into the fascinating world of decision-making with Cass R. Sunstein’s groundbreaking book, “Decisions about Decisions: Practical Reason in Ordinary Life.” Prepare to gain invaluable insights that will transform the way you approach choices in your daily life.

Genres

Psychology, Behavioral Economics, Decision Theory, Cognitive Science, Self-Help, Personal Development, Philosophy, Ethics, Sociology, Public Policy

In “Decisions about Decisions,” Cass R. Sunstein explores the complexities of practical reason in everyday life. He delves into the psychological, social, and cognitive factors that influence our decision-making processes. Sunstein examines how we navigate choices, weigh options, and ultimately make decisions that shape our lives.

The book offers a comprehensive analysis of decision-making, drawing from various disciplines such as psychology, economics, and philosophy. Sunstein presents a framework for understanding how we reason and make judgments in real-world situations. He highlights the role of heuristics, biases, and social influences in shaping our choices.

Throughout the book, Sunstein provides practical insights and strategies for improving our decision-making skills. He emphasizes the importance of critical thinking, self-reflection, and considering multiple perspectives. By understanding the underlying mechanisms of decision-making, readers can develop a more deliberate and effective approach to making choices.

Sunstein’s writing style is engaging and accessible, making complex concepts easy to grasp. He uses relatable examples and case studies to illustrate his points, allowing readers to connect the ideas to their own experiences. The book strikes a perfect balance between academic rigor and practical application, making it valuable for both scholars and general readers.

“Decisions about Decisions” is a must-read for anyone seeking to enhance their decision-making skills and navigate the challenges of everyday life with greater clarity and confidence. Sunstein’s insights will empower you to make better choices, both personally and professionally, and lead a more fulfilling life.

Recommendation

Cass R. Sunstein is a reliable, smart, and prolific author who often writes in depth about how people think. He co-authored Nudge and has written many perceptive books. Here, Sunstein provides a set of useful decision-making strategies, including schemes that eliminate routine decisions, dig out relevant information, neutralize unconscious biases that can undermine your judgment, and let you off the hook when you should delegate a decision to experts or algorithms. Sunstein offers a detailed, sometimes technical tour of a range of tools to help you avoid decision paralysis, make smart decisions, or, in some cases, circumvent the need for a decision altogether.

Take-Aways

  • Decision-making can evoke strong emotions; emotions influence judgment.
  • Decision-making can be straightforward.
  • Information plays a complex role in decision-making.
  • Many beliefs stem from a decision to believe.
  • Social dynamics influence consumer decisions.
  • Using computer-based algorithms is a burgeoning “second-order” decision-making strategy.

Summary

Decision-making can evoke strong emotions; emotions influence judgment.

People have devised a number of strategies to mitigate the emotional stress of making a decision. These “second-order” decision-making strategies are templates for resolving various types of dilemmas. For example, one strategy suggests avoiding personal decision-making altogether by leaving the choice up to chance, such as using lotteries to choose people for jury duty.

“Material outcomes matter. But people’s emotional experiences also matter, and when we make decisions, we ought to focus on that fact.”

People also avoid making decisions by setting up rules that predetermine their options. For example, you might establish a rule that you will always follow your doctor’s advice on health issues, instead of being indecisive. Another strategy involves breaking a problem into steps and creating a sequence of smaller decisions about each one.

Decision-making can be straightforward.

Experts in economics, politics, and psychology view decision-making as a straightforward process: they often advise examining the pros and cons of various proposals and applying that data to choose an option. In daily use, people also turn to a range of second-order strategies that serve as road maps to help them overcome obstacles, such as insufficient information or the influence of biases.

Some of these strategies require substantial thought in advance but little thought on the spot, whereas others require little thought at either stage. Each second-order decision-making strategy raises or lowers the burdens associated with the planning and choosing stages of a decision:

  • “High-low” strategies — These methods involve establishing protocols for handling routine issues. They lower the stress of making a final choice but may necessitate high levels of planning.
  • “Low-low” — This approach minimizes the burdens people face throughout the decision-making process. Decision-makers may use randomizing tools such as coin flips and lotteries, “mental shortcuts” such as choosing a political candidate by party affiliation, or a more structured solution such as breaking a problem into smaller steps. Low-low strategies are most appropriate when the decision-maker lacks sufficient information about the problem.
  • “Low-high” — Using this approach, the person responsible for producing a decision delegates the choice to someone else, such as an expert on the issue. This lowers the first party’s burden but can impose high decision costs on the designated arbiter.
  • “High-high” — This produces high burdens both while preparing for a decision and when making the final choice. It is appropriate only for very high-stakes decisions.

Information plays a complex role in decision-making.

Clearly, information is a necessary element of decision-making, but people often avoid it. They may exclude information they see as irrelevant to them, but they also may avoid potentially useful information if they suspect it will upset them.

“Behavioral scientists have…shown that our decisions are not always entirely rational.”

People tend to assess and accept information not according to its merit, but according to how they see the value of knowing it or avoiding it. They might evaluate information’s “instrumental value,” that is, whether it enhances their autonomy or power. For instance, people may be more likely to accept empowering information that might help them make profitable investments, improve their work performance, support their health, or choose environmentally friendly products.

Decision-makers also assess information’s “affective value,” that is, if it evokes positive emotions. Even information of little practicality can be valuable if it promotes someone’s sense of meaning or well-being.

People also may seek information because they are curious. Learning about topics such as science, other cultures, or history may not have an immediate practical use, but it can make life richer and more meaningful.

“It is reasonable to suspect that in daily life, emotional reactions to good news and bad news are a major determinant of whether people decide to find such news to be credible.”

To determine in advance whether they want to know something, people consider what they already know and use that to make an internal prediction about whether receiving the new information will produce pleasant or unpleasant feelings. However, these predictions often are not reliable because of the influence of bias and other cognitive quirks. For example, under the influence of “present bias,” people focus on current conditions and ignore the long term. They avoid information they suspect might distress them in the present, even if it might prove valuable in the future.

People believe some ideas because having that belief makes them feel good. Believing positive information about yourself and the world around you is pleasant. People also avoid information that contradicts their beliefs. When they encounter new information, they determine whether to accept it by assessing – often unconsciously – the utility of sticking with their old belief or adopting the new one.

People’s willingness to hear bad news also can depend on whether they believe they can use it to improve their condition. For example, people who have difficulty with self-control when deciding what to eat are more likely to avoid reading the calorie labels on foods. Other people welcome information about food’s calorie content because they anticipate that this knowledge will improve their diets.

Many beliefs stem from a decision to believe.

Some beliefs don’t require a conscious choice. Sometimes seeing is believing – you have seen apples fall when you drop them, so you believe that if you drop an apple, it will fall. But with concepts you can’t verify personally, such as the structure of the solar system or the existence of dinosaurs, you decide whether to believe. You come to a conclusion by evaluating available information that supports or contradicts these ideas.

Sometimes people choose a belief for its utilitarian value. For example, the French philosopher Blaise Pascal argued in the 1600s that believing in God is a smart bet, regardless of whether the belief is accurate: If God exists, you benefit, he said. If the belief proves false, you lose nothing.

“People decide (to believe) what they want to believe.”

Climate change illustrates the fraught relationship between information and belief. The volume of information about climate change can give rise to a variety of feelings. Research on acceptance of beliefs about the climate demonstrates how people with different beliefs respond to new information. Those who strongly believe in anthropogenic climate change and those who are “less certain” about it both employ strategies of “asymmetrical updating.”

People who strongly believe that climate change is a growing threat may be more accepting of bad news than those who disregard global warming’s danger because ill tidings support their conviction that the world must address this issue. Those who are less certain about climate change’s dangers employ an “opposite asymmetry” — they are more likely to modify their beliefs when they encounter good news about the climate. A third group, those with a “moderate” belief in climate change, tends to give the same weight to both good and bad news.

“Motivated reasoning,” which stresses the role of emotions, is one explanation for these differences. For example, when it comes to their health, people are more likely to accept good news – because it evokes positive feelings – and to find reasons to discount bad news.

“People are more likely to check their investment portfolios and to learn whether they are gaining or losing money during periods when the stock market is going up than during periods when it is going down.”

People change their beliefs when the outcome of a new belief holds more value for them than the payoff for clinging to an old one; their willingness to maintain or change a belief depends on how valuable it is to them.

Holding certain beliefs can lead to “outcomes” that are either “external” or “internal” and either positive or negative. External outcomes include tangible consequences, such as financial rewards. Consider these two types of external outcomes:

  1. “Accuracy-independent” outcomes — These rewards, such as money or “social acceptance,” accrue to those who hold a certain belief, regardless of whether it is accurate. This would come into play, for instance, when holding certain views improves an applicant’s chance of getting a particular job.
  2. “Accuracy-dependent” outcomes — This describes the benefits that result from accurate beliefs and the costs that stem from inaccurate beliefs. For example, someone who invests in the stock market because they believe prices will rise could win or lose, depending on whether their beliefs correctly reflect the market’s behavior.

“Internal outcomes” are the “cognitive or affective” consequences of a belief:

  1. Accuracy-independent outcomes — Having a positive belief in the future leads to having a positive attitude in the present.
  2. Accuracy-dependent outcomes — Students who believe they will get a good grade on an upcoming test could feel sadness or chagrin if they fail.

Policymakers who are trying to predict or influence people’s beliefs should consider all these possible outcomes. Policies that promote transparency, such as regulations calling for more information on product labels, operate on the assumption that people seek accurate data.

“In recent years, many people have changed their beliefs about what constitutes workplace harassment, and about whether smoking in public venues is acceptable.”

When public health agencies encourage people to get a COVID-19 vaccine, for example, they stress external accuracy-dependent outcomes. To persuade people to participate, they share information about the vaccine’s safety and effectiveness.

A fact-based effort to counter “fake news” may fail because correcting inaccurate information that people believe to be true threatens their sense of well-being. If belief in false information is important in their social circle, they will be reluctant to accept facts that contradict their beliefs.

Social dynamics influence consumer decisions.

People choose products for both their inherent value and their social value. That means part of the value of enjoying a popular song comes from knowing that others like it, since shared enjoyment engenders a sense of community. Such “social goods” include events that people consume in groups — such as seeing a movie in a theater — or in private, but with the knowledge that others are enjoying the same thing — such as streaming a TV series.

“Exclusivity goods” are a form of social good whose value rises when only a limited number of people can enjoy them. The value of admission to an exclusive resort, for instance, decreases if too many people gain access to it.

“A society that contains few solidarity goods is likely to have a wide range of problems.”

Governments often promote and fund “solidarity goods” that enhance societal well-being, such as educational television programming, sports teams, national holiday celebrations, or the protection of public resources, such as wildlife refuges and historic sites.

Using computer-based algorithms is a burgeoning “second-order” decision-making strategy.

“Prediction problems,” a common pitfall in decision-making, arise when people assess the likely consequences of their options. In many situations, algorithms prove more effective than human beings at forecasting future conditions. For example, judges have to decide whether to release defendants who are awaiting trial. A judge must assess the risk that a defendant will flee or commit a crime if released. One study found that an algorithm with access to the same information as the judge made more accurate predictions about whether particular defendants would commit a crime in the future.

Judges’ predictive powers go astray when “current offense bias” affects their reasoning – that is, when they regard the severity of the current charges against a defendant as an indication of the threat he or she might pose if released. The more serious the current crime, the more likely the judge is to view the defendant as high risk. Algorithms, which weigh current charges in the context of the defendant’s history, can help judges avoid acting on this bias.

“Deciding by algorithm may not bring a bright smile to the face, but in important domains, it is the best decision about decisions.”

Experts’ judgment can falter due to “availability bias.” When people assess the probabilities in a particular situation, they draw on their knowledge of or experience in comparable situations. They are most likely to refer to whichever comparable situations come to mind first, often the most recent ones. As a result, for example, doctors are more likely to test for certain conditions if they recently treated other patients with those conditions. They substitute a mental shortcut for a more thorough consideration of relevant statistics.

About the Author

Cass R. Sunstein is the founder and director of Harvard Law School’s Program on Behavioral Economics and Public Policy and a former administrator of the White House Office of Information and Regulatory Affairs. He has authored or co-authored many books, including Nudge: Improving Decisions About Health, Wealth, and Happiness; Noise: A Flaw in Human Judgment; Wiser: Getting Beyond Groupthink to Make Groups Smarter; and How to Become Famous: Lost Einsteins, Forgotten Superstars, and How the Beatles Came to Be.