Thinking, Fast and Slow

How human minds make decisions.

Thinking, Fast and Slow. By Daniel Kahneman. Farrar, Straus and Giroux, 2011. 512 pp.

The popular American television series Mad Men depicts a stereotypical workplace dynamic that never seems to go out of style: the brilliant manager who takes all the credit for closing the deals while delegating “the details” to his assistant. The bosses can come in late and take leisurely lunches, but the assistants are the ones who know where the clients’ files are and make sure their bosses show up to pitch meetings on time. I kept thinking about Mad Men as I read Thinking, Fast and Slow, Daniel Kahneman’s treatise on what influences human behavior. The TV characters may be clichés, but they are part of the everyday patterns of behavior that Kahneman, the recipient of a Nobel Prize in Economics in 2002, explores and explains with brilliant observation and insight.

The book tells a story that captures the state of both the art and the science of human nature, as chronicled by the prophets of experimental cognitive psychology over the past half-century. The story is about two likeable but flawed characters, whom Kahneman calls System 1 and System 2. System 1 is the under-appreciated administrative assistant of our mental life, the one who is constantly vigilant for any signs of danger or opportunity and “thinks fast.” System 2 is the “slow-thinking” rational manager who shows up at the last moment and yet truly believes he or she deserves all the credit. System 1 handles routine day-to-day matters using highly efficient filing systems and makes quick judgments with impressive accuracy. However, System 1 is also prone to certain systematic errors, such as assuming that the information at hand is the sum of all the relevant information to be gained, or assigning causal relationships to unrelated events. System 2 is capable of more refined, critical thinking, but is also lazy and prefers to allow System 1 to handle as much as possible, even when a given task is probably above the assistant’s pay grade. By allowing these characters to emerge through pithy descriptions of key psychological experiments, Kahneman gives meaning to a vast range of real-world experiences.

For example, why do people tend to buy insurance right after a disaster? One of the quirks of System 1 is that, in order to make judgments rapidly and constantly, we have a tendency to substitute easy questions in place of hard questions, without even realizing that we are doing so. So, instead of asking about the overall likelihood of earthquakes, tornadoes, and floods, we ask how recently one of these disasters occurred. We use that answer to determine whether we need insurance.

Or why can it be advantageous to be the first person to name a figure when negotiating a salary, the price of a new car, or a divorce settlement? If we aren’t sure of the exact worth of something, our System 1 will accept any suggested figure as a starting point—even a number tossed out at random—and tweak it upward or downward until it lands on a number that feels right. The upshot is that if you start out with a high anchor number, the final figure is likely to be much higher than if you start out with a low anchor number.

And why do we assume a nerdy, anti-social guy is more likely to be a computer science major than an English major, even though we know that English majors dramatically outnumber computer science majors? Because, once we have information that seems to correspond to a stereotype, System 1 values that association more than any statistical information it would pass on to System 2.

Kahneman concludes that System 2 is who we all like to think we are—the rational manager making conscious judgments based on all the relevant information. And System 1 is who we really are—the highly efficient assistant making countless snap decisions based on mental shortcuts that work most of the time, i.e., except when they don’t. System 1 gets blamed for all the mistakes, and System 2 takes credit for all the successes, but in reality, System 2 is responsible for letting the mistakes slip in, and also for making some errors of his own. Kahneman observes wryly that, in principle, it’s easy to avoid mistakes—simply slow down and put in more effort. Unfortunately, in practice, it’s precisely when we should slow down and reflect that we’re most likely to forge ahead on instinct alone.

Why does it make sense when Julia Roberts’ character, Shelby, in Steel Magnolias explains to her mother, played by Sally Field, “I’d rather have five minutes of wonderful than a lifetime of nothing special”? Here, Shelby has a choice between a comfortable, happy life with a husband who loves her but no children, or the risk of carrying a child but damaging her fragile health. System 1 doesn’t track the duration of events but is more interested in the peak of an experience and the way things turn out in the end. System 2 constructs a life story based on major events and moments, so the two systems together favor choices that will produce more intensely positive outcomes. In this case, Shelby chooses to risk the health consequences of pregnancy in order to have the peak experience of motherhood, though she knows that the duration of that experience may be tragically short.

Kahneman dedicates this book to Amos Tversky, his longtime collaborator and good friend. The story of their friendship, which is inextricably linked to the breakthroughs they made together, is a rich, human narrative of the scientific method at work. No one jumps out of a tub shouting “Eureka!” Rather, two friends take long walks, talk about their observations, and discuss hunches about why people make the decisions they do. They figure out ways to test these hunches, proving some and rejecting others along the way. Sometimes, they get caught in intellectual dead ends, held up by related concepts that haven’t been thoroughly explained, or innovative research methods that haven’t yet been developed. In a few moving instances, Kahneman describes how their students, or their students’ students (that is, their intellectual children and grandchildren in academic parlance) overcome these limitations.

Thinking, Fast and Slow is not merely a résumé of Kahneman’s work with Tversky. He also gives credit and attention to the contributions of many scholars who came before, during, and after his most productive years with Tversky. One particularly interesting section in this vein weighs the relative merits of expert versus public perceptions of risk, and the real-life implications of policies based on each. For example, he writes, “terrorism speaks directly to System 1.” Deaths from terrorism, even in places like Israel, rarely approach the number of deaths in traffic accidents in any given week. But the novelty of terrorist attacks, coupled with their powerful narrative and visual impact, guarantees media coverage that makes these events loom larger than their statistical likelihood of occurrence would seem to warrant. That accessibility to System 1 leads the public to demand protective policies even when a more deliberate System 2 approach would leave experts room to respond more judiciously.

Kahneman claims that as he was writing, his imaginary audience consisted of office workers gathering around the water cooler to share opinions and rumors. His ambition was to enhance their understanding of how the objects of their gossip affected their decision-making, and to give them a better vocabulary to talk about it. Despite this deceptively humble objective, Thinking, Fast and Slow is profound in its result.

Sheila Peuchaud is an assistant professor in the Department of Journalism and Mass Communication at the American University in Cairo.