Summary: The Loop by Jacob Ward

  • The Loop examines how digital technologies and artificial intelligence exploit unconscious human biases to shape behavior, often for the profit of technology companies.
  • Pick up this book if you want to understand how cutting-edge technology manipulates human decision-making – and how to push back.

Recommendation

Sophisticated technologies, combined with deepening knowledge of unconscious human behavior, lead people to fundamentally change how they act without knowing how or why – and often for the profit of technology companies. Unconscious forces drive human behavioral patterns, such as racial bias and addiction. Contemporary technologies can harness those forces for political and economic purposes. Technology feeds humans’ worst predispositions – emotional, political, even aesthetic – back to them in ways they can’t resist. NBC technology correspondent Jacob Ward describes the three “loops” by which technologies compromise how people make decisions and live. Ward’s passionate insights will engage anyone concerned about how digital technologies and artificial intelligence (AI) affect contemporary society – and their own decision-making.

Take-Aways

  • “Unconscious habits” inform human experience.
  • Three basic rules govern people’s decisions and prove crucial to how technology controls people’s lives.
  • Human behavior seems free, but “guidance systems” control it.
  • Businesses use technologies to manipulate human behavior, yet people trust them.
  • People don’t understand AI’s nature, limits or manipulative power.
  • Technologies may so dominate life that people may forget how to choose what they most like or how to communicate with one another.
  • AI can’t improve everything.

Summary: The Loop: How Technology Is Creating a World Without Choices and How to Fight Back by Jacob Ward

“Unconscious habits” inform human experience.

Following the First World War, gravely injured soldiers streamed through Austrian medical clinics, and many had disjointed perceptions of the world. For some, the ordinary peripheral perceptions that people usually don’t notice became accentuated and unbearable.

Austrian neurologist and psychiatrist Otto Pötzl treated and researched these patients. His 1917 essay about one patient described how the brain receives and processes information without all of it entering consciousness. Subsequent research showed that the reality humans experience is only the best, most efficient version the brain pieces together from an overwhelming excess of information.

“We believe the story our mind is telling us because we believe that’s the only story there is…As we build machines and systems that organize, simplify and mutate our stories for us, we are just as vulnerable to believing those new tales as well.”

The brain isn’t a “closed system.” The brain can unconsciously reconstruct perceptions from all the senses and unconsciously assimilate and communicate emotional states. Researchers are increasingly identifying the unconscious patterns and predispositions people develop over the course of their lives that guide their decisions and actions. The science behind this may be young and undeveloped, but people in politics and business already exploit people’s unconscious patterns in order to manipulate their behavior.

Three basic rules govern people’s decisions and prove crucial to how technology controls people’s lives.

Between 1971 and 1979, psychologists Daniel Kahneman and Amos Tversky published a series of papers that proved fundamental for the burgeoning field of “behavioral guidance” that seeks to influence and shape people’s decisions. In a groundbreaking 1974 essay, Kahneman and Tversky articulated unconscious biases people bring to decision-making and three “heuristics” that distort human decision-making: “representativeness, availability and anchoring.”

Representativeness reflects what people associate with a specific category; this can lead people to misjudge whether something is in that category, such as whether a person is a doctor. Availability shows that people regard things that come to mind quickly and easily as more likely to occur – such as crime. The fact that you remember a crime from, say, seeing one on the news, doesn’t mean that it’s more likely to occur again. Anchoring means that a probability people already have in mind influences their estimations of the probability of something occurring. The probability you start with shapes the probability you arrive at – regardless of evidence you find in the meantime.

“Kahneman and Tversky had identified three handy ways we make short work of the world’s complications, but they also knew the three heuristics posed a real danger to our ability to make good choices in the modern world.”

Other heuristics followed. Baruch Fischhoff, a graduate student working under Kahneman and Tversky, discovered “hindsight bias”: Things that have already happened seem as though they couldn’t have happened otherwise – and people assign them a higher probability of recurring than things that haven’t happened yet. Fischhoff’s colleague Paul Slovic founded a firm called Decision Research that explored, among other things, how irrationally people assess the risk of something bad happening. Slovic concluded that emotions, like representativeness and availability, are part of an unconscious process through which people make decisions.

This research helped uncover the systematic “patterns of human irrationality” that shape decision-making. Humans are vulnerable to irrational decision-making. And people are designing technologies that play on those vulnerabilities and that people don’t know how to resist. This is part of what Ward calls the first loop.

Human behavior seems free, but “guidance systems” control it.

Robots have difficulty performing simple human tasks. For example, robots at a competition sponsored by the Defense Advanced Research Projects Agency (DARPA), the Defense Department’s research agency, couldn’t climb a ladder, much less undertake elaborate missions. The problem lies in the massive amount of data required to perform such actions. People must help robots process all that data explicitly and, in a way, consciously. Human beings don’t operate that way.

“We are geared to unconsciously offload difficult cognitive tasks to automatic systems, whether they be our emotions or a collection of shiny but unreliable robots.”

In 2000, psychologists Keith Stanovich and Richard West concluded that human minds operate in accordance with two systems, “System 1” and “System 2.” System 1 makes quick, unconscious, relatively straightforward decisions of the kind that impede robots from climbing ladders. System 2 is responsible for the more reflective, analytical decisions that require time and energy. Most judgments people make fall under System 1, but System 2 oversees System 1. Most failures in rational choice are failures in both systems. System 1 causes the failure; System 2 doesn’t detect it.

People’s brains deceive them into thinking they make free choices. In fact, their behavior is mostly driven by unconscious forces or “guidance systems.” Even though people’s choices are nowhere near as free as they imagine, they judge and deplore people whose behavior is shaped by forces they can’t control, such as people living in poverty or struggling with addiction. Humans created technologies – and the for-profit businesses behind them – that exploit human cognitive vulnerabilities and delusions. These technologies promote activities, such as substance abuse, whose costs and benefits people have scant ability to assess.

Businesses use technologies to manipulate human behavior – yet people trust them.

People resist uncertainty, need reassurance and more often than not remain blind to lessons they might learn from the past. Their emotions drive their decision-making. To some extent, this mental partitioning provides evolutionary advantages.

“While I believe it’s clear that the mental and physical health of entire generations could be at stake, I also believe that capitalism, culture, and our conviction that we are in charge of our own destinies are blinding us to the threat.”

Entrepreneurs grasp that they can exploit people’s unconscious predispositions for business purposes. Companies are busy finding ever more precise ways to influence human behavior. The unconscious in general fuels human behavior, but for-profit businesses dependent on digital technologies sample those unconscious patterns and reproduce them for customers. These are manipulative strategies, but people believe and trust them.

Facebook and Google, for example, deploy such approaches with vast computing capacity and “algorithmic sophistication.” Systems people don’t understand or find confusing reduce their critical acumen. System 2’s functions for correcting automatic System 1 can easily suffer disruption. For decades, people have placed unjustified levels of trust in what machines tell them. For Ward, this is the second loop’s core.

People don’t understand AI’s nature, limits or manipulative power.

Invented in a feverish burst of creativity in the mid-1950s, AI was an intellectual revolution that spurred a third and more advanced phase – the third loop – of how technologies shape human behavior and life.

“AI looks impossibly sophisticated and entirely inscrutable. We, the people whose behavior it will shape, just don’t understand what it is and what it isn’t.”

AI isn’t a “robotic intelligence” designed to replicate and ultimately replace the human mind’s complex and often subtle operations. An AI is any system that learns through data to perform a given task. AI systems learn to perform tasks in at least three ways. In “machine learning,” the system makes predictions based on and limited to existing data patterns. In “supervised learning,” the system deciphers patterns and correlated outcomes within accurately “labeled data” and thereby predicts future patterns and correlations. In “reinforcement learning,” the system eliminates incorrect raw data, approves of correct data and searches for patterns in the correct data.
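
To make one of these learning styles concrete, here is a minimal sketch of the supervised-learning idea – a system that finds patterns in accurately labeled data and uses them to predict labels for new inputs. This example is not from the book; the function name and data are invented for illustration.

```python
# Toy illustration of "supervised learning": find patterns in
# labeled examples, then predict labels for new, unlabeled inputs.
# All data here is invented for demonstration.

def nearest_neighbor(labeled_data, point):
    """Predict a label by copying the closest labeled example."""
    closest = min(labeled_data, key=lambda ex: abs(ex[0] - point))
    return closest[1]

# Labeled data: (hours of study, pass/fail outcome)
examples = [(1, "fail"), (2, "fail"), (6, "pass"), (8, "pass")]

print(nearest_neighbor(examples, 7))   # → pass
```

Even this tiny system shows the pattern Ward describes: it emits a confident answer without explaining how it got there.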

AI is worrisome precisely because the human mind is likely to trust whatever the AI system tells it. The critical, reflective System 2 mind is always ready to offload decision-making to the emotions, impulses and unconscious patterns inscribed in System 1.

AI systems need an “objective function.” An AI system’s objective function is whatever task(s) humans want the system to perform. Another important feature of such a system is how rigorously the system pursues its objective function. An AI system, and the way the machine-learning system enables it, will remain a mystery to most people. The system provides answers – which people are naturally predisposed to believe and trust – but doesn’t reveal how it arrives at the answers.

Machine learning systems do their work without any transparency. This is inconsequential for trivial tasks, but it becomes ethically problematic as AI takes on increasingly sophisticated human aims, such as choosing which person will do a job optimally. Some people advocate for greater transparency or “explainability” in AI, but achieving that presents formidable technical challenges.
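
The idea of an objective function can be illustrated with a toy sketch – an invented example, not one from the book. The system is told to minimize a single number (here, the squared error between its predictions and known targets) and adjusts itself until that number is as small as it can make it, without ever explaining its reasoning.

```python
# Toy "objective function": the single number the system is told
# to minimize – here, squared error between prediction and target.
def objective(w, data):
    return sum((w * x - y) ** 2 for x, y in data)

def fit(data, steps=200, lr=0.01):
    """Gradient descent: nudge w downhill on the objective."""
    w = 0.0
    for _ in range(steps):
        # gradient of the objective with respect to w
        grad = sum(2 * x * (w * x - y) for x, y in data)
        w -= lr * grad
    return w

data = [(1, 2), (2, 4), (3, 6)]   # invented data following y = 2x
w = fit(data)
print(round(w, 3))   # → 2.0
```

The system "pursues" its objective rigorously – it recovers the underlying pattern – but all an outside observer sees is the final answer, which mirrors the opacity problem described above.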

Technologies may so dominate life that people may forget how to choose what they most like or how to communicate with one another.

“Pattern-recognition technology” can shape human life at every level and all over the world. Algorithms, for example, can select what you eat, drink and wear, and what entertainment event you should go to. These algorithms will eventually connect with one another and literally shape an entire human life. In so doing, this conjunction of algorithms will limit human freedom and emphasize people’s most problematic unconscious impulses.

“I worry that as we become caught in a cycle of sampled behavioral data and recommendation, we will be instead caught in a collapsing spiral of choice, at the bottom of which we no longer know what we like or how to make choices or how to speak to one another.”
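
The “collapsing spiral of choice” Ward worries about can be sketched as a simple feedback loop – a toy model, not an example from the book, with invented categories. A recommender samples what you clicked before and recommends more of it, and each click makes the same recommendation still more likely.

```python
import random

# Toy sketch of a recommendation feedback loop: recommendations are
# drawn in proportion to past clicks, and each click reinforces the
# clicked category, so the effective menu narrows over time.
def simulate_loop(rounds=20, seed=0):
    rng = random.Random(seed)
    categories = ["news", "sports", "music", "cooking", "travel"]
    weights = {c: 1.0 for c in categories}   # equal interest at first
    for _ in range(rounds):
        # recommend in proportion to past clicks...
        choice = rng.choices(categories,
                             weights=[weights[c] for c in categories])[0]
        # ...and the click makes that category more likely next time
        weights[choice] += 1.0
    return weights

final = simulate_loop()
print(max(final, key=final.get))   # the category the loop converged on
```

Because every click feeds back into the next recommendation, early random choices get amplified: the simulated user ends up seeing mostly one category, regardless of any underlying preference.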

When AI dominates one part of your life, it inevitably spreads to every other part. For example, early in the COVID-19 pandemic, public health officials discussed a COVID-19 “passport” for those with a negative test or a vaccination. Google and Facebook planned to create a Bluetooth-driven app that would alert people when they had been close to someone who had contracted the infection and agreed to have their contacts recorded.

Soon enough, a company used AI to develop drones that could identify people with COVID-19 symptoms from the air. Such surveillance systems could ultimately gather much other data, such as people’s heart rates and skin tone, and their operators could employ them for purposes other than health. For example, the police department of a commuter town near New York City – one that already monitored people at train stations and used facial recognition software – contacted the company and proposed collaborating. Police-operated AI surveilled and analyzed the town’s residents, who knew nothing about it. Nothing in these systems limits what police departments or other institutions do with the data they accumulate.

AI can’t improve everything.

Pattern-recognition technologies such as AI or Facebook’s algorithms, by design, engage people’s biases and the shortcuts System 1 takes in decision-making. They also pander to for-profit incentives. But people may well use these technologies, and their accompanying business incentives, for good. Ultimately, the crucial question is whether for-profit companies may infiltrate every aspect of people’s lives with AI, or whether people can preserve and protect traditional, slow, System 2 decision-making.

About the Author

Jacob Ward is a technology correspondent for NBC News, reporting on technology’s social implications.

Genres

Nonfiction, Technology, Science, Psychology, Artificial intelligence, Behavioral economics, Sociology

Review

The Loop examines how artificial intelligence and pattern-recognition technologies exploit the unconscious shortcuts that govern human decision-making. Drawing on a century of behavioral science – from Otto Pötzl’s studies of perception to Kahneman and Tversky’s heuristics – Ward describes three nested loops: the ancient loop of unconscious bias, the modern loop of technologies that sample and amplify those biases for profit, and an emerging third loop in which AI narrows human choice itself.

Ward writes with urgency but without alarmism. He is careful to explain what AI is and is not – a system that learns from data to perform a task, not a robotic mind – and why its opacity inclines people to trust its outputs uncritically.

This accessible work of technology journalism rewards readers who want to understand how for-profit recommendation systems shape behavior – and what preserving slow, deliberate System 2 thinking might require.

