CHAPTER 2

Amplifying Errors

No one would get a Nobel Prize for demonstrating that human beings make mistakes. Yet we have noted that in recent decades, behavioral scientists have done a lot of work to specify how and when people err. And in fact, at least five Nobel Prizes in Economics have been won by scientists associated with behavioral economics; the winners include Daniel McFadden in 2000, George Akerlof in 2001, Daniel Kahneman in 2002, Thomas Schelling in 2005, and Robert Shiller in 2013.

Garbage In? A Brief Guided Tour

The real advances can be found in efforts to specify exactly why and when human beings go wrong—in identifying the mechanisms that lead people to err. We know that people use heuristics, or mental shortcuts, that lead them to make predictable errors. People are also subject to identifiable biases, which produce systematic errors.1 As we have noted, both heuristics and biases can be connected with the family of cognitive operations known as System 1—the intuitive, automatic system that works fast and without a lot of effort. Our main interest here is in group behavior, and in providing a guided tour, we emphasize the heuristics and biases that are most relevant to what groups do, especially in the domains of business and government.

Availability

Human beings err because they use the availability heuristic to answer difficult questions about probability. How likely is a terrorist attack, a hurricane, a traffic jam, an accident from a nuclear power plant, a sexually transmitted disease? How likely is it that a particular product or reform will succeed? That a new film or television show will flop or attract a big audience?

When people use the availability heuristic, they answer a question of probability by asking whether examples come readily to mind. People tend to say that on a random printed page, more words will end with the letters ing than will have the letter n in the second-to-last position. This is an obviously incorrect judgment (every word ending in ing has n in the second-to-last position, so the second category must be at least as large as the first), made because words ending with ing are easy to call to mind.2 Such findings bear on private and public responses to risks—suggesting, for example, that people will be especially responsive to the dangers of crime, earthquakes, and environmental disasters because examples are easy to recall.

In business and government, people often respond to yesterday’s famous failure (or celebrated success). If a particular strategy or approach turned out disastrously, it will be very much in mind, and today’s decision will be made in light of that disaster. No politician wants to be responsible for “another Vietnam” or to be “another Neville Chamberlain.” Foreign policy is sometimes made by reference to available analogies—not the worst approach in the world, but a pretty unreliable one. If a company put a lot of money into a recent flop, the firm is likely to try to avoid a new initiative that looks even a little bit like the flop, even if the new initiative has a lot of promise. The availability heuristic can lead people in bad directions; it’s a lot worse than a careful statistical analysis.

This heuristic also explains some of the sources of discrimination on the basis of race, sex, age, and disability. If it is easy to bring to mind cases in which a female employee quit work to care for her family, sex discrimination is more likely. And whenever a company makes employment-related decisions, there is a risk that the availability heuristic will produce mistakes. Good decisions rely on the full performance record, not on snapshot impressions or on whether a candidate is reminiscent of someone who failed or succeeded. In fact, Michael Lewis’s best seller, Moneyball, is a terrific case study of this point, showing that statistical analysis is a lot better than personal impressions in the evaluation of baseball talent—a finding that applies to the evaluation of talent of all kinds.

In this way, familiarity can affect the availability of instances. But salience is important as well. A terrorist attack shown on television will be vivid and highly salient to viewers and will have a greater impact than a report about the attack in the newspaper.3 The point helps explain a lot of human behavior. For example, whether people will buy insurance for natural disasters is greatly affected by recent experiences.4 In the aftermath of an earthquake, people suddenly become far readier to buy insurance for earthquakes—but their readiness to do so declines steadily from that point on, as vivid memories recede.

Use of the availability heuristic is far from irrational, but it can easily lead to serious errors of fact. After a flood or a hurricane, it is predictable that people will take significant steps to prepare for floods or hurricanes—and also possible that before any low-probability disaster, they will take inadequate precautions or maybe none at all. If companies use the availability heuristic, they may well make significant mistakes, at least if they make excessive generalizations from events that are recent or readily called to mind.

Emphasizing the success of the Oakland Athletics general manager Billy Beane, Lewis’s Moneyball points to the importance, in baseball, of relying not on intuition, emotions, anecdotes, or even experience, but on careful statistical analysis. The “Moneyball” idea is being applied in many domains, including government itself. Decision makers do a lot better playing their own version of Moneyball than they do using the availability heuristic. We might well see Moneyball as a strategy to strengthen the hand of System 2, using it as a safeguard against the mistakes of System 1.

Representativeness

Most people also follow the representativeness heuristic, which means that our judgments about probability are influenced by assessments of resemblance or similarity (the extent to which A looks like B, physically or otherwise).5 You might think that a prospective job candidate “looks like” a good CEO, and this thought might greatly affect your judgment. Some seasoned baseball scouts, for example, tend to focus on whether a prospect looks like a ballplayer. Billy Beane liked to respond, “We’re not selling jeans here.” Rejecting the representativeness heuristic, he examined players’ statistics, not their game films.

The representativeness heuristic is famously exemplified by people’s answers to questions about the likely career of a hypothetical woman named Linda, described as follows: “Linda is thirty-one years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice and also participated in antinuclear demonstrations.”6 People were asked to rank, in order of probability, eight possible futures for Linda. Most of these were fillers (such as psychiatric social worker, elementary school teacher); the two crucial ones were “bank teller” and “bank teller and active in the feminist movement.”

Most people said that Linda was less likely to be a bank teller than to be a bank teller and active in the feminist movement. This is an obvious mistake, a logical fallacy in the form of a conjunction error, in which two characteristics A and B are thought to be more likely than characteristic A alone. The error stems from the representativeness heuristic: Linda’s description seems to match our stereotypes of “bank teller and active in the feminist movement” far better than “bank teller.” The representativeness heuristic helps explain what social scientists Paul Rozin and Carol Nemeroff have called “sympathetic magical thinking,” including the beliefs that some objects (especially disgusting ones) have contagious properties and that causes resemble their effects.7 The representativeness heuristic often works well, but in business and politics, it can also lead to severe blunders. Hiring decisions often go badly wrong because of the use of representativeness.
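To see why that answer cannot be right, it helps to make the conjunction rule concrete. Here is a minimal sketch (ours, with invented base rates, not part of the original studies) that counts outcomes in a made-up population; however the numbers are chosen, the share of people satisfying two conditions can never exceed the share satisfying either condition alone.

    # Illustrative only: a made-up population in which each person is tagged
    # with whether she is a bank teller and whether she is an active feminist.
    import random

    random.seed(0)
    population = [
        {
            "bank_teller": random.random() < 0.05,  # hypothetical base rate
            "feminist": random.random() < 0.30,     # hypothetical base rate
        }
        for _ in range(10_000)
    ]

    tellers = sum(p["bank_teller"] for p in population)
    feminist_tellers = sum(p["bank_teller"] and p["feminist"] for p in population)

    # The joint group is a subset of the single group, so the conjunction
    # can never be more probable than "bank teller" alone.
    print(f"P(bank teller)              ~ {tellers / len(population):.3f}")
    print(f"P(bank teller AND feminist) ~ {feminist_tellers / len(population):.3f}")
    assert feminist_tellers <= tellers

Whatever base rates are plugged in, the final check always passes; judging the conjunction to be more probable than one of its parts is a logical error, not a difference of opinion.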

Relying on representativeness, as in the case of Linda, rather than on a more systematic, rational process can produce stereotyped thinking, again leading to mistakes. Malcolm Gladwell cites the historical example of people’s exaggerated estimation of Warren G. Harding’s capacity for leadership, simply because he looked so “presidential”; his appearance and demeanor matched our prototype of a political leader.8 Other “halo effects,” such as the tendency to vote for candidates who look competent or who are simply taller, again matching our mental image of a prototypical leader, are partly a product of the representativeness heuristic.9

Framing

Human beings are subject to framing effects, which means that their decisions are influenced by how a problem is framed, even though the different frames reflect merely semantic differences rather than differences in substance. By a “frame,” we mean how a problem is presented to someone, or how people represent or define a decision from their own perspective. The same decision can be thought of in terms of “What is lost?” or “What is gained?” or in terms of “What should I choose?” or “What should I reject?” The key point is that the frame has a big effect on what is ultimately decided. For example, people are more likely to have an operation if they are told that after five years, 90 percent of people are alive than if they are told that after five years, 10 percent of people are dead. A meat product advertised as “90 percent fat-free” is far more appealing than one advertised as “10 percent fat.” If people are told that they will “save $600 per year” if they use energy conservation approaches, they are less likely to use such approaches than if they are told that if they decline to do so, they will “lose $600 per year.”

Perhaps the most interesting manifestation of gain-loss framing effects is a subtle impact on our appetite for risk taking (“are alive” induces the gain frame, “are dead” the loss frame; “90 percent fat-free” is gain, “10 percent fat” is loss). Sometimes, sadly, we face options that are most naturally seen as losses from our current circumstances. If a decision dilemma is framed in terms of losses for both alternatives, most people shift from a cautious, risk-avoiding attitude to a risk-seeking one. This is especially true if some of the possible but uncertain outcomes associated with a course of action get us back to the status quo or break-even condition. Most obviously, this point explains some of the crazy bets that gamblers place, in a usually futile effort just to get back to where they started, but there are many more socially consequential examples in medical, business, and government decisions.

We can debate whether and to what extent it is rational for people to be so vulnerable to how problems are framed. But there is no doubt that individuals can be nudged by descriptions of choices, even if the more effective descriptions involve clever uses of words, rather than substantive differences.

Egocentric Bias and More

If you are like most people, you suffer from egocentric bias—the bias that leads most of us to think that other people think and act as we do. We exaggerate the extent to which our tastes and preferences are typical. When asked what percentage of other people go to the movies on Saturday night, love Bob Dylan or Taylor Swift, favor a particular political candidate, or think that the latest Steven Spielberg movie will win the Oscar, most of us show a bias in the direction that we ourselves favor.

This bias leads to major mistakes. For example, people who support any particular candidate typically overestimate the candidate’s chances of success. Even though this has happened before, it will happen again; it’s hard to avoid the bias. The same problem can arise in firms, as people think that if they love a product, other people are going to love it too. Since people who make products often love what they make, they might well end up overestimating its appeal to others.

We have referred to other biases as well, and while each has produced an extensive literature, we will rest content with a few additional notes of special importance to business (but also, as it happens, romantic relationships).10 If a course of action is failing, people tend to increase their commitment to it, rather than simply cutting their losses. Human beings are prone to unrealistic optimism. About 90 percent of drivers have been found to believe that they are safer than the average driver. With respect to most of the risks that human beings face, people typically think that their prospects are a bit better than reality warrants.

So too, human beings are prone to overconfidence, not least when they make investment decisions. Thus, they exhibit a habit of overprecision, believing that their forecasts are more accurate and precise than in fact they are. What’s your best estimate of the value of a favorite investment at the end of the year? You might say, “$74 per share.” Now, what is your 90 percent confidence interval around that estimate, such that you will be surprised only 10 percent of the time by an actual value outside that interval? If you are a typical investor, you will be surprised far more than 10 percent of the time.11
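What overprecision looks like in numbers can be made concrete with a small simulation. The sketch below is ours, with invented figures: if an investor’s assumed uncertainty is much narrower than the true variability of outcomes, the actual year-end value lands outside the stated 90 percent interval far more often than 10 percent of the time.

    # Illustrative only: invented numbers showing how an overly narrow
    # 90 percent confidence interval produces too many "surprises."
    import random

    random.seed(1)
    forecast = 74.0      # the investor's best estimate, dollars per share
    assumed_sd = 8.0     # the (too small) variability the investor assumes
    true_sd = 20.0       # the hypothetical true variability of the outcome

    # A 90 percent interval spans roughly +/- 1.645 standard deviations.
    half_width = 1.645 * assumed_sd

    trials = 100_000
    surprises = sum(
        abs(random.gauss(forecast, true_sd) - forecast) > half_width
        for _ in range(trials)
    )

    # A well-calibrated interval would be breached about 10 percent of the
    # time; the overconfident interval is breached far more often.
    print(f"surprise rate: {surprises / trials:.1%}")

The particular numbers are made up; the structure is the point. Overprecision means intervals that are systematically too narrow, so the forecaster is surprised much more often than she expects to be.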

We have mentioned the planning fallacy, which is closely associated with both optimism and overconfidence.12 Projects small (term papers) and large (the Big Dig tunnel in Boston, the Sochi Olympics site) invariably run over budget and take much longer to complete than the original plans. When companies try to make valid cost estimates, even for routine projects, they often use multipliers of greater than two, but their estimates are still too low. The error arises from a myopic focus on one scenario for the project, which blocks off thoughts about the myriad of contingencies—interruptions from competing commitments, supply-chain failures, unexpected failures by partners, “acts of god” like weather conditions, and so on—and thus produces a coherent, overoptimistic narrative.

Human beings also suffer from hindsight bias, which means that when things turn out a certain way, they tend to think, I knew it all along.13 A real problem with hindsight bias is that it can decrease the likelihood of learning. If I am not surprised by the outcome, because hindsight falsely tells me I already knew what would happen, I will not put any thought into revising my beliefs about the situation. The next time I am confronted with a similar judgment, I repeat my mistake.

Finally, people fall prey to the sunk-cost fallacy, which means that they do not act rationally with respect to costs that they have already paid out.14 If I invest in concert tickets (or health club memberships or a war), I am motivated to stay the course, even when I lose interest or when it is obvious (at least from outside the endeavor) that I am involved in a losing proposition. We will explore some potential remedies for this fallacy, and for others, in part 2.

Garbage Out

For our purposes here, a central and even crucial question is whether groups avoid the errors made by individuals. Often they do not—providing vivid illustrations of the principle “garbage in, garbage out,” in a way that mocks the aspiration to collective correction of individual blunders. In fact, individual errors are not merely replicated but actually amplified in many group decisions—a process of “some garbage in, much garbage out.”

Here is an especially disturbing finding, one with great relevance to group behavior in both business and politics: groups are even more likely than individuals to escalate their commitment to a course of action that is failing—and all the more so if members identify strongly with the groups of which they are a part.15 We know that when things start to go awry, some individuals ramp up their commitment to their original plan, in part because of a strong emotional commitment to making the plan work.

Groups are even worse on this count. There is a clue here to why companies, states, and even nations often continue with projects and plans that are clearly failing. If a company is marketing a product that is selling poorly, it may continue on its misguided course simply because of group dynamics. So too with a nation whose economic policy or approach to foreign affairs is hurting its citizens.

We noted that with respect to the planning fallacy, groups are also worse than individuals—a finding that reflects the risk that groups will be far too optimistic. Groups have also been found to have the following problems:

  • They amplify, rather than weaken, reliance on the representativeness heuristic.16
  • They show more unrealistic overconfidence than do individual group members.17
  • They are even more vulnerable to framing effects than are individuals.18
  • They are more affected by spurious arguments from lawyers.19
  • They are even more susceptible to the sunk-cost fallacy.20

In a revealing finding that bears on the power of the representativeness heuristic, groups have been found to make more, rather than fewer, conjunction errors (the belief that two events together are more likely to occur than either one of the events alone) than do individuals, when individual error rates are high. Conversely, groups make fewer conjunction errors when individual error rates are low.21

Similar biases infect the legal system. Suppose, for example, that individual jurors are biased because of pretrial publicity that misleadingly implicates the defendant or even because of the defendant’s unappealing physical appearance. If so, juries are likely to amplify rather than correct those individual biases.22 This point is not restricted to juries; it has implications for group decisions of all kinds. If individuals in a firm are somehow biased by irrelevant or invalid evidence, the group may well be more biased still.

We don’t want to leave the impression that every judgment bias is amplified by groups. For some biases, groups repeat the individual error, but do not increase it, and for others, they might even decrease it. Compared with individuals, groups have been found to demonstrate a slightly lower level of reliance on the availability heuristic (recall that this heuristic can lead to clear errors).23 And people’s tendency to anchor on salient numbers (“anchoring bias”) is somewhat reduced by group deliberation, as are the hindsight and egocentric biases. The larger point, however, is that individual biases are not systematically corrected at the group level and they often get worse.

Why More Garbage?

Why are individual errors so often amplified and so infrequently corrected at the group level? Return to our two themes: informational pressures and social influences. Both are at work.

Suppose that most members of a group are prone to make specific errors. If the majority makes those errors, then most people will see others making the same errors. What they see will work as “social proof,” conveying information about what is right. Those who are not specialists are likely to think, If most people make the same errors, maybe they are not errors at all. Social influences also play a role. If most group members make errors, other members also might make them simply not to seem disagreeable or foolish. When groups amplify the blunders of their members, social influence is a large reason.

To be sure, there is some good news, in the form of evidence that deliberating groups can correct or reduce certain biases. We have seen that for “eureka” problems, groups do well, even if individual members begin with an answer produced by some kind of bias. This is an important finding. As we will see, it has implications for how groups can do better. If group members recognize that an answer, once announced, is clearly right, they will tend to converge on it. This convergence on the truth can occur for three reasons. First, people sometimes just recognize the answer as right; it is somewhere in their minds, but they haven’t had access to it, so there is a sudden click of recognition. Second, the person who announces the right answer may be able to demonstrate its validity, perhaps with a compelling argument or a logical derivation. And finally, the technical authority of the person is sometimes indisputable. When estimating a fact about traffic, a highway engineer is the authority; about medicine, the physicians are the experts. This raises a nice question: How can groups become more likely to say some version of “eureka”?

Groups will also do better than individuals in overcoming egocentric bias. The reason is straightforward. As an individual, you will focus on your own tastes—on what you like and what you don’t like. Maybe you are wildly excited about a new Captain America movie, or maybe you think that a new cell phone, bright pink and shaped in a circle with Ronald Reagan’s picture on it, is awesome and impossible to resist. If you consult with others, you are likely to find out that your own taste is idiosyncratic. (One of us agrees with you, though, about a new Captain America movie.)

The broader lesson is that if groups are able to benefit from having diverse views, people will quickly learn that their own position is not universally held, and hence the bias is reduced. In these cases, group deliberation supplies an important corrective. Note that we’re less likely to get that corrective if the group consists of like-minded people, where egocentric bias might be amplified.

This is probably why the availability bias is slightly reduced in groups. Maybe the group members all rely on what comes to mind, but the members are likely to have different memories, yielding a more representative sample of salient information at the group level. Or consider the hindsight bias. Compared with individuals, groups are slightly less susceptible to that bias.24 Group members who are not susceptible to hindsight bias may be able to persuade others that it is indeed a bias, or perhaps some members were surprised by the outcome (which reduces the individual hindsight effect).

But the larger point is that with group discussion, individual errors are often propagated and amplified rather than eliminated. When individuals show a high degree of bias, groups are likely to be more biased, not less biased, than their median or average member. Here, then, is a major reason that groups fail.
