DECISIVE: How to Make Better Choices in Life and Work

Chip Heath & Dan Heath

336 pages, Crown Business, 2013


Shannon, the head of a small consulting firm, is agonizing about whether to fire Clive, her IT director. Over the past year, Clive has consistently failed to do more than the minimum required of him. He’s not without his talents—he’s intelligent and has a knack for improvising cheap solutions to technical problems—but he rarely takes any initiative. Worse, his attitude is poor. In meetings, he is often critical of other people’s ideas, sometimes caustically so.

Unfortunately, losing Clive would cause problems in the short term. He understands how to maintain the company’s database of clients better than anyone else.

What would you advise her to do? Should she fire him or not?

IF YOU REFLECT ON the past few seconds of your mental activity, what’s astonishing is how quickly your opinions started to form. Most of us, reflecting on the Clive situation, feel like we already know enough to start offering advice. Maybe you’d advise Shannon to fire Clive, or maybe you’d encourage her to give him another chance. But chances are you didn’t feel flummoxed.

“A remarkable aspect of your mental life is that you are rarely stumped,” said Daniel Kahneman, a psychologist who won the Nobel Prize in economics for his research on the way that people’s decisions depart from the strict rationality assumed by economists. In his fascinating book, Thinking, Fast and Slow, he describes the ease with which we draw conclusions: “The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way. You like or dislike people long before you know much about them; you trust or distrust strangers without knowing why; you feel that an enterprise is bound to succeed without analyzing it.”

Kahneman says that we are quick to jump to conclusions because we give too much weight to the information that’s right in front of us, while failing to consider the information that’s just off stage. He calls this tendency “what you see is all there is.” In keeping with Kahneman’s visual metaphor, we’ll refer to this tendency as a “spotlight” effect. (Think of the way a spotlight in a theater directs our attention; what’s inside the spotlight is crisply illuminated.)

The Clive situation above is an example of the spotlight effect. When we’re offered information about Clive—he does only the bare minimum, he doesn’t take initiative, he has a poor attitude, and his boss might fire him—we find it very easy to take that readily available set of information and start drawing conclusions from it.

But of course a spotlight only lights a spot. Everything outside it is obscured. So, in Clive’s situation, we don’t immediately think to ask a lot of obvious questions. For instance, rather than fire Clive, why not change his role to match up better with his strengths? (After all, he’s good at improvising cheap solutions.) Or maybe Clive could be matched with a mentor who’d help him set more ambitious goals and deliver less scathing criticism.

Furthermore, what if we dug deeper and discovered that Clive’s colleagues adore his crusty, straight-talking ways? (Maybe he’s the IT version of Dr. House.) And what makes us think that Shannon’s take on Clive is impeccably accurate? What if she is a terrible manager? When we begin shifting the spotlight from side to side, the situation starts to look very different. We couldn’t possibly hope to make a good decision about Clive without doing this spotlight shifting. Yet developing an opinion was easy without doing it.

And that, in essence, is the core difficulty of decision making: What’s in the spotlight will rarely be everything we need to make a good decision, but we won’t always remember to shift the light. Sometimes, in fact, we’ll forget there’s a spotlight at all, dwelling so long in the tiny circle of light that we forget there’s a broader landscape beyond it.

IF YOU STUDY THE kinds of decisions people make and the outcomes of those decisions, you’ll find that humanity does not have a particularly impressive track record.

Career choices, for instance, are often abandoned or regretted. An American Bar Association survey found that 44% of lawyers would recommend that a young person not pursue a career in law. A study of 20,000 executive searches found that 40% of senior-level hires “are pushed out, fail or quit within 18 months.” More than half of teachers quit their jobs within four years. In fact, one study in Philadelphia schools found that a teacher was almost twice as likely to drop out as a student.

Business decisions are frequently flawed. One study of corporate mergers and acquisitions—some of the highest-stakes decisions executives make—showed that 83% failed to create any value for shareholders. When another research team asked 2,207 executives to evaluate decisions in their organizations, 60% of the executives reported that bad decisions were about as frequent as good ones.

On the personal front, we’re not much better. People don’t save enough for retirement, and when they do save, they consistently erode their own stock portfolios by buying high and selling low. Young people start relationships with people who are bad for them. Middle-aged people let work interfere with their family lives. The elderly wonder why they didn’t take more time to smell the roses when they were younger.

Why do we have such a hard time making good choices? In recent years, many fascinating books and articles have addressed this question, exploring the problems with our decision making. The biases. The irrationality. When it comes to making decisions, it’s clear that our brains are flawed instruments. But less attention has been paid to another compelling question: Given that we’re wired to act foolishly sometimes, how can we do better?

Sometimes we are given the advice to trust our guts when we make important decisions. Unfortunately, our guts are full of questionable advice. Consider the Ultimate Red Velvet Cheesecake at the Cheesecake Factory, a truly delicious dessert—and one that clocks in at 1,540 calories, which is the equivalent of three McDonald’s double cheeseburgers plus a pack of Skittles. This is something that you are supposed to eat after you are finished with your real meal.

The Ultimate Red Velvet Cheesecake is exactly the kind of thing that our guts get excited about. Yet no one would mistake this guidance for wisdom. Certainly no one has ever thoughtfully plotted out a meal plan and concluded, I gotta add more cheesecake.

Nor are our guts any better on big decisions. On October 10, 1975, Liz Taylor and Richard Burton celebrated the happy occasion of their wedding. Taylor was on her sixth marriage, Burton on his third. Samuel Johnson once described a second marriage as the “triumph of hope over experience.” But given Taylor and Burton’s track record, their union represented something grander: the triumph of hope over a mountain of empirical evidence. (The marriage lasted 10 months.)

Often our guts can’t make up their minds at all: an estimated 61,535 tattoos were reversed in the United States in 2009. A British study of more than 3,000 people found that 88% of New Year’s resolutions are broken, including 68% of resolutions merely to “enjoy life more.” Quarterback Brett Favre retired, then unretired, then retired. At press time he is retired.

If we can’t trust our guts, then what can we trust? Many businesspeople put their faith in careful analysis. To test this faith, two researchers, Dan Lovallo, a professor at the University of Sydney, and Olivier Sibony, a director of McKinsey & Company, investigated 1,048 business decisions over five years, tracking both the ways the decisions were made and the subsequent outcomes in terms of revenues, profits, and market share. The decisions were important ones, such as whether or not to launch a new product or service, change the structure of the organization, enter a new country, or acquire another firm.

The researchers found that in making most of the decisions, the teams had conducted rigorous analysis. They’d compiled thorough financial models and assessed how investors might react to their plans.

Beyond the analysis, Lovallo and Sibony also asked the teams about their decision process—the softer, less analytical side of the decisions. Had the team explicitly discussed what was still uncertain about the decision? Did they include perspectives that contradicted the senior executive’s point of view? Did they elicit participation from a range of people who had different views of the decision?

When the researchers compared whether process or analysis was more important in producing good decisions—those that increased revenues, profits, and market share—they found that “process mattered more than analysis—by a factor of six.” Often a good process led to better analysis—for instance, by ferreting out faulty logic. But the reverse was not true: “Superb analysis is useless unless the decision process gives it a fair hearing.”

To illustrate the weakness of the decision-making process in most organizations, Sibony drew an analogy to the legal system:

    Imagine walking into a courtroom where the trial consists of a prosecutor presenting PowerPoint slides. In 20 pretty compelling charts, he demonstrates why the defendant is guilty. The judge then challenges some of the facts of the presentation, but the prosecutor has a good answer to every objection. So the judge decides, and the accused man is sentenced. That wouldn’t be due process, right? So if you would find this process shocking in a courtroom, why is it acceptable when you make an investment decision? Now of course, this is an oversimplification, but this process is essentially the one most companies follow to make a decision. They have a team arguing only one side of the case. The team has a choice of what points it wants to make and what way it wants to make them. And it falls to the final decision maker to be both the challenger and the ultimate judge. Building a good decision-making process is largely ensuring that these flaws don’t happen.

Dan Lovallo says that when he talks about process with corporate leaders, they are skeptical. “They tend not to believe that the soft stuff matters more than the hard stuff,” he said. “They don’t spend very much time on it. Everybody thinks they know how to do this stuff.” But the ones who do pay attention reap the rewards: A better decision process substantially improves the results of the decisions, as well as the financial returns associated with them.

The discipline exhibited by good corporate decision makers—exploring alternative points of view, recognizing uncertainty, searching for evidence that contradicts their beliefs—can help us in our families and friendships as well. A solid process isn’t just good for business; it’s good for our lives.

Why a process? Because understanding our shortcomings is not enough to fix them. Does knowing you’re nearsighted help you see better? Or does knowing that you have a bad temper squelch it? Similarly, it’s hard to correct a bias in our mental processes just by being aware of it.

Most of us rarely use a “process” for thinking through important decisions, like whether to fire Clive, or whether to relocate for a new job, or how to handle our frail, elderly parents. The only decision-making process in wide circulation is the pros-and-cons list. The advantage of this approach is that it’s deliberative. Rather than jump to conclusions about Clive, for example, we’d hunt for both positive and negative factors—pushing the spotlight around—until we felt ready to make a decision.

What you may not know is that the pros-and-cons list has a proud historical pedigree. In 1772, Benjamin Franklin was asked for advice by a colleague who’d been offered an unusual job opportunity. Franklin replied in a letter that, given his lack of knowledge of the situation, he couldn’t offer advice on whether or not to take the job. But he did suggest a process the colleague could use to make his own decision. Franklin said that his approach was “to divide half a sheet of paper by a line into two columns, writing over the one Pro and over the other Con.” During the next three or four days, Franklin said, he’d add factors to the two columns as they occurred to him. Then, he said:

    When I have thus got them all together in one view, I endeavour to estimate their respective weights; and where I find two, one on each side, that seem equal, I strike them both out: If I find a reason Pro equal to some two reasons Con, I strike out the three. If I judge some two reasons Con equal to some three reasons Pro, I strike out the five; and thus proceeding I find at length where the balance lies; and if after a day or two of farther consideration nothing new that is of importance occurs on either side, I come to a determination accordingly. [Capitalization modernized.]

Franklin called this technique “moral algebra.” Over 200 years after he wrote this letter, his approach is still, broadly speaking, the approach people use when they make decisions (that is, when they’re not trusting their guts). We may not follow Franklin’s advice about crossing off pros and cons of similar weight, but we embrace the gist of the process. When we’re presented with a choice, we compare the pros and cons of our options, and then we pick the one that seems the most favorable.
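To make Franklin’s tallying concrete, here is a minimal sketch in Python, assuming each factor has already been given a subjective numerical weight. The moral_algebra function, the weights, and the sample factors for the Clive decision are hypothetical, invented purely for illustration; Franklin canceled equal-weight pros and cons by hand over several days rather than summing numbers.

    # A rough sketch of Franklin's "moral algebra," with invented weights.
    # Each dict maps a factor's description to a subjective weight (1-5).
    def moral_algebra(pros, cons):
        """Compare weighted pros and cons and report where the balance lies."""
        pro_total = sum(pros.values())
        con_total = sum(cons.values())
        if pro_total > con_total:
            return f"Balance favors the Pro column ({pro_total} vs. {con_total})."
        if con_total > pro_total:
            return f"Balance favors the Con column ({con_total} vs. {pro_total})."
        return "The columns cancel out; keep gathering factors before deciding."

    # Hypothetical weights for the decision to fire Clive (Pro = reasons to fire).
    pros = {"does only the bare minimum": 2, "caustic in meetings": 2}
    cons = {"knows the client database best": 3, "improvises cheap fixes": 2}
    print(moral_algebra(pros, cons))

Once the weights are numeric, summing each column is equivalent to Franklin’s striking-out: canceling a pro and a con of equal weight never changes which column comes out ahead.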

The pros-and-cons approach is familiar. It is commonsensical. And it is also profoundly flawed.

Research in psychology over the last 40 years has identified a set of biases in our thinking that doom the pros-and-cons model of decision making. If we aspire to make better choices, then we must learn how these biases work and how to fight them (with something more potent than a list of pros and cons).