Table of Contents
- What Is Third Millennium Thinking and How Does It Improve Decisions?
- Recommendation
- Take-Aways
- Summary
- People often rely on expert opinions to make consequential decisions.
- The scientific method helps people establish a shared reality.
- Probabilistic thinking – which considers uncertainty – offers an alternative connection to the world.
- People are often overconfident about their beliefs.
- Deciphering real information from “noise” proves difficult for most people.
- Errors come in many forms.
- Learning from experience isn’t easy.
- Confirmation bias is common and insidious.
- Third Millennium Thinking moves toward interdisciplinary approaches that include experimentation, consensus, reduced bias, and teamwork.
- About the Authors
What Is Third Millennium Thinking and How Does It Improve Decisions?
Learn how to navigate information overload and make better decisions using tools and probabilistic thinking from Third Millennium Thinking by Saul Perlmutter, John Campbell, and Robert MacCoun. Ready to cut through the noise and make smarter, more strategic decisions in a confusing world? Keep reading to discover how the interdisciplinary strategies of Third Millennium Thinking can transform your approach to problem-solving!
Recommendation
Due to the internet, vast amounts of information – much of it specialized and of dubious quality – are so readily available that few people can determine what’s relevant and meaningful. Physics Nobel Laureate Saul Perlmutter, philosopher John Campbell, and social psychologist Robert MacCoun explain how to use tools and quantitative methodologies from the natural and social sciences to navigate the information avalanche. Their book, which began as material for a class at the University of California at Berkeley, provides a template for understanding a confusing world. It can help you make better, more strategic decisions at work and in your private life.
Take-Aways
- People often rely on expert opinions to make consequential decisions.
- The scientific method helps people establish a shared reality.
- Probabilistic thinking – which considers uncertainty – offers an alternative connection to the world.
- People are often overconfident about their beliefs.
- Deciphering real information from “noise” proves difficult for most people.
- Errors come in many forms.
- Learning from experience isn’t easy.
- Confirmation bias is common and insidious.
- Third Millennium Thinking moves toward interdisciplinary approaches that include experimentation, consensus, reduced bias, and teamwork.
Summary
People often rely on expert opinions to make consequential decisions.
Available information swamps you online. For example, one archive of websites contains nearly a trillion pages of digital content, and that’s just the material dating back to 1996. Online digital content at this dizzying scale covers nearly every topic – from fresh, specialized technical information to outdated scientific and medical data – as well as malicious misinformation. This scattered, often inconsistent sea of information makes it challenging to find reliable, relevant, and useful data – especially when you need information to help you make a difficult or urgent decision.
“To make a sound decision, take a meaningful action, or solve a problem – whether as individuals, in groups, or as a society – we need first to understand reality.”
Suppose, for instance, you undergo a dramatic, unexpected health episode. You black out, collapse, and wake up in a mediocre local hospital. The on-duty neurologist thinks you have a brain bleed and might need risky emergency surgery. But the surgeon is not a world expert, and surgery might make your condition worse – or lead to your death. How do you decide what to do?
Since you have no neurological training, you must find an expert in neuroscience whom you can trust with your life. You need worthy advice that springs from scientific information, not trendy ideas or groupthink.
In the United States, science can be divisive. For example, hardcore conservatives insist that climate change and global warming either don’t exist or don’t pose a serious risk to life on Earth. Conversely, liberals foresee the collapse of civilization if global warming continues at its current pace. You might think that those with whom you disagree simply don’t grasp the relevant science. But there are scientifically literate people across the spectrum of viewpoints. While the issues have become politicized – and some people manipulate science itself for political ends – society must address important matters, such as climate change, on the basis of a single shared reality. Hardly anyone believes that the truth about the external world varies from person to person according to their opinions.
“How are we to reach agreement on what’s ‘out there’ in reality, if both sides are capable of weaponizing the evidence to fit with the ideas they already have?”
People acquire a shared sense of the external or physical world through their senses: sight, touch, hearing, taste, and smell. If you and others can see and touch an object in front of you, then you share an external reality.
People now can call on intermediaries at various levels of sophistication to amplify and extend their senses. Glasses improve eyesight, microscopes provide access to an otherwise invisible world, and telescopes extend humanity’s vision to outer space. Hearing aids sharpen hearing, and your smartphone provides a detailed analysis of sounds. The use of instruments to perceive external reality enhances scientific experiments and people’s basic sense of common reality. However, even instruments that enhance people’s senses must undergo testing through multiple, shared observations.
Evidence-based science transformed how people think about reality. They now understand that the world consists of far more than they can see, including cells and subatomic particles. In addition, science and the scientific method require seeing the world as a web of causes and effects. Understanding causes and effects – not just correlations between phenomena – enables people to intervene and change the world. For example, it can help people figure out how to treat diseases.
Probabilistic thinking – which considers uncertainty – offers an alternative connection to the world.
People may agree they live in a common, shared world. But no one knows everything, and knowledge isn’t absolute. In fact, knowledge brings a degree of uncertainty. That makes people uneasy, though it’s crucial to understand “what we don’t know or only partially know.” Fortunately, scientific thinking provides a way to incorporate uncertainty into the process of determining a confident, productive way to act.
“Science offers us a radically different way to think about our connection to this reality that we know something, but not everything, about.”
Science provides the possibility of working with ideas people may not know absolutely, but know with some degree of confidence. Scientists don’t work in absolutes. Scientists – along with a growing number of nonscientists – think in probabilistic terms. They attach degrees of uncertainty and degrees of confidence to their beliefs. That is, scientists quantify their confidence. They’ll never say they’re 100% certain about anything that hasn’t already happened. If they’re confident, they might say they’re 99.999% certain.
People are often overconfident about their beliefs.
Assigning probabilities and degrees of confidence to your beliefs makes you less attached to those beliefs and more amenable to changing your mind and pursuing alternatives. When you’re deliberating decisions, quantifying your confidence in a belief allows you to give that belief the appropriate weight in your decision-making. Nonetheless, even accomplished scientists can be overconfident about their beliefs. For example, in 2020, a respected scientist asserted that COVID-19 would run its course within four weeks, causing fewer than 170,000 deaths. The scientist was spectacularly wrong – COVID hasn’t disappeared, and it has killed more than a million people worldwide. The scientist should have assigned a degree of confidence to this assertion.
“Expert overconfidence can have grave consequences.”
Similarly, experts should be humble and open to the possibility of error and alternative conclusions. One study found that 44% of scientists question at least some of their published findings. Scientists must calibrate their levels of confidence and check if their estimated confidence matches the likelihood of something happening in the world. When you rely on experts, stay mindful of how much uncertainty they’re willing to acknowledge. No one can be 100% certain. And no one is infallible.
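The calibration check described above can be made concrete. The following Python sketch (illustrative, not from the book) groups a forecaster's past predictions by stated confidence and compares each group's stated confidence with the fraction of predictions that actually came true; the data and function names are hypothetical.

```python
# A minimal calibration check: for each confidence level a forecaster
# stated, compute how often predictions at that level actually came true.
# A well-calibrated forecaster's "90% sure" claims come true ~90% of the time.
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_confidence, came_true) pairs."""
    buckets = defaultdict(list)
    for confidence, came_true in predictions:
        # Group predictions into buckets of similar stated confidence.
        buckets[round(confidence, 1)].append(came_true)
    report = {}
    for confidence, outcomes in sorted(buckets.items()):
        # Observed hit rate: the fraction of predictions that came true.
        report[confidence] = sum(outcomes) / len(outcomes)
    return report

# An overconfident forecaster: claims 90% certainty, right only half the time.
history = [(0.9, True), (0.9, False), (0.9, True), (0.9, False)]
print(calibration_report(history))  # {0.9: 0.5} -- stated 90%, observed 50%
```

A gap between stated confidence and the observed hit rate is exactly the overconfidence the authors warn about: the forecaster's numbers carry more certainty than their track record supports.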
Deciphering real information from “noise” proves difficult for most people.
Scientists’ first passes at an issue are often misguided. In one incident, cosmologists faced an avalanche of data that pointed in a fascinating direction. Eventually, they realized this groundbreaking data was a side effect of the character and position of the instrument they were using. They had misinterpreted the data.
More precisely, they had confused collateral information, or noise, with relevant data, a “signal.” In this context, a signal is data providing information, and noise is whatever impedes the work of deciphering that signal. For example, despite receiving plenty of relevant information, American intelligence officials failed to identify the impending September 11, 2001, terrorist attacks. They couldn’t locate the crucial signal amid all the noise.
“Scientists often have to design techniques to pull out a signal buried in noise, and this requires knowing quantitatively how deep the signal is buried.”
Quantifying the “signal-to-noise ratio” helps determine the signal’s strength and clarity. If the ratio is 9:2, the scientist is in good shape – the signal is strong and clear. If the ratio is 2:1, the situation becomes problematic. The brain evolved to detect patterns, even in noise, and that tendency can easily fool people. Remain aware that noise can deceive you. Find ways – perhaps through science – to identify real signals in the noise that washes over everyone every day.
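One standard technique for pulling a signal out of noise – not one the summary spells out, but a textbook illustration of the idea – is averaging repeated measurements: random noise tends to cancel while the signal accumulates, so the effective signal-to-noise ratio grows roughly with the square root of the number of observations. The sketch below uses made-up numbers to show this.

```python
# Illustrative sketch: recovering a weak signal by averaging noisy measurements.
# The true signal (1.0) is buried under noise five times larger, yet the
# average of many measurements converges on the true value.
import random
import statistics

random.seed(42)

TRUE_SIGNAL = 1.0   # the quantity we are trying to measure
NOISE_SD = 5.0      # noise much larger than the signal

def measure():
    """One noisy observation: the signal plus Gaussian noise."""
    return TRUE_SIGNAL + random.gauss(0.0, NOISE_SD)

for n in (1, 100, 10_000):
    samples = [measure() for _ in range(n)]
    estimate = statistics.mean(samples)
    print(f"n = {n:>6}: estimate = {estimate:+.2f}")
```

With one measurement the estimate is dominated by noise; with ten thousand, it sits close to the true value – a quantitative version of the authors' point that scientists must know how deeply a signal is buried before they can design a way to extract it.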
Errors come in many forms.
People make errors. No one can achieve 100% certainty, but you can establish a standard of proof that you require before you draw a conclusion. Among other forms, errors come in false positives and false negatives. When you’re making decisions, both forms can be consequential. For a juror in a criminal trial, for example, embracing either form can be the difference between exonerating a guilty person and convicting an innocent one.
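The trade-off between the two kinds of error can be sketched in a few lines of Python. The scenario and numbers below are hypothetical: each case gets a noisy "evidence score," and a decision threshold plays the role of the standard of proof.

```python
# Illustrative sketch of the false-positive / false-negative trade-off.
# Raising the threshold ("demanding more proof") reduces false positives
# but produces more false negatives -- you cannot minimize both at once.

def error_counts(cases, threshold):
    """cases: list of (evidence_score, actually_guilty) pairs."""
    false_positives = sum(1 for score, guilty in cases
                          if score >= threshold and not guilty)
    false_negatives = sum(1 for score, guilty in cases
                          if score < threshold and guilty)
    return false_positives, false_negatives

# Evidence scores: the guilty tend to score higher, but the ranges overlap.
cases = [(0.2, False), (0.4, False), (0.6, False),
         (0.5, True), (0.7, True), (0.9, True)]

print(error_counts(cases, threshold=0.45))  # (1, 0): lenient -- one innocent convicted
print(error_counts(cases, threshold=0.65))  # (0, 1): strict -- one guilty walks free
```

Because the score distributions overlap, no threshold eliminates both error types; choosing one is choosing which error you care about more – the juror's dilemma in miniature.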
“In a world full of noise and uncertainty, we are bound to make errors. Often we care more about one type of error than another.”
Errors can prove especially unsettling when they connect to decisions, such as a verdict or a broader policy choice. Policymakers in particular evaluate the implications of potential errors through a political, not scientific, lens. Setting statistical trade-offs regarding possible errors can be troubling in public policy. Sample sizes are typically small, measurement is often fuzzy, and information is frequently scarce or unavailable. Sometimes leaders can address public policy issues by adopting a provisional course of action, seeing what happens, and re-evaluating the policy on the basis of newly uncovered facts. Otherwise, policymakers tend to do what people often do in their personal lives: They wait until the data improves and the signal-to-noise ratio becomes advantageous before they make a decision.
Learning from experience isn’t easy.
People assume they learn from experience. American culture values this concept, which has intuitive appeal. When people need a surgeon or even an auto mechanic, they prioritize turning to professionals with experience. They assume experienced people will be better at their jobs than those with fewer years on the job. Yet older, more experienced workers often don’t outperform younger workers who have good initial training. In medicine, for example, older doctors sometimes perform less well than their younger colleagues because they lack full awareness of new developments in their field.
“Many of the factors that make it hard to learn from experience are psychological rather than environmental.”
People do many things out of habit, with little self-conscious thought or reflection. Habits are efficient – they allow you to perform common tasks, even those that require advanced skills, without investing energy in thinking about them. Bad habits can become so ingrained that people aren’t even aware of them. When people learn new skills, they think about what they’re doing and how they’re doing it. Once a skill or activity becomes a habit, it’s difficult to change.
Confirmation bias is common and insidious.
People do learn from experience – otherwise, the entire scientific approach would be pointless. But people can suffer from biases in ways that block such learning. For example, they fall prey to confirmation bias – the predisposition to seek evidence that confirms their existing ideas – and, therefore, ignore evidence that contradicts them.
“Probably the most successful de-biasing strategy identified to date is called consider the opposite – or, in more complex cases, consider the alternative.”
When you seek expertise – from scientists or other experts – try to determine if they’ve made an effort to reduce their confirmation bias.
Third Millennium Thinking moves toward interdisciplinary approaches that include experimentation, consensus, reduced bias, and teamwork.
Third Millennium Thinking introduces the tools and techniques for a new kind of enlightenment – a move toward interdisciplinary and probabilistic thinking. It calls for emphasizing experimentation, building consensus, adopting ways to reduce bias, and engaging in teamwork.
“We have the Third Millennium Thinking ingredients, we have the motivation, and we have the reasonable optimism that could make this the millennium in which we advance as a global human family.”
Third Millennium Thinking offers updated versions of the scientific practices that allowed people to overcome countless challenges over the millennia. So far, the 21st century has presented significant challenges, including climate change. But people can successfully address these challenges if they apply “better collective thinking.”
About the Authors
Saul Perlmutter won the 2011 Nobel Prize in Physics. He is a professor of physics at the University of California at Berkeley, where John Campbell is a professor of philosophy. Robert MacCoun is a professor of law at Stanford University and a senior fellow at the Freeman Spogli Institute for International Studies.