Chapter 2

Your Observers Are Cognitive Misers (and So Are You)

In the 1980s, psychologists Susan Fiske and Shelly Taylor were looking for a way to describe what research was showing to be a ubiquitous tendency among humans: to think only as much as they feel they need to, and no more. And so the metaphor of the cognitive miser was born, with each of us an Ebenezer Scrooge—except instead of sitting on piles of money and refusing to pay for an extra lump of coal to keep the house warm, we sit on reserves of mental energy and processing capacity, unwilling to spend much of it unless we really have to. We rely on simple, efficient thought processes to get the job done—not so much out of laziness (though there is some of that, too), but out of necessity. There is just too much going on, too much to notice, understand, and act on, for us to give every individual and every occurrence our undivided, unbiased attention. So not only are you innately hard to understand, but the people observing you are hoarding their attention.

Human thought, like every other complex process, is subject to the speed-versus-accuracy trade-off. Go fast, and you make mistakes. Be thorough and diligent, and you take an eternity. We are, as Fiske later called us, motivated tacticians—strategically choosing ease and speed, or effort and accuracy, depending on our motivation. Most of the time, just the gist will do, so we choose speed.

The cognitive miser’s favorite shortcut tools are heuristics and assumptions. Heuristics are rules of thumb like “Things that come to mind easily happen more frequently.” In other words, if I ask you, “Does your Uncle Phil lose his temper a lot?” and you can remember a lot of times when your Uncle Phil lost his temper, then you will probably conclude that yes, Phil loses his temper quite often. But if you have a hard time recalling such an instance, you will probably conclude that Phil is gentle as a lamb. Like most rules of thumb, this heuristic will steer you toward the right answer much of the time. But it can also lead you astray.

Quick—which is more common, getting struck by lightning or getting bitten by a shark? Most people think shark bites are more frequent, when in fact roughly five thousand people in the United States are struck by lightning each year, compared with only ten to fifteen who are attacked by sharks. (On the National Geographic Shark Week website, I also learned the fun fact that in 1996, only thirteen people were injured by sharks, while forty-three thousand were injured by toilets, and twenty-six hundred by room fresheners.)1

Why do we think sharks are a much bigger source of danger than lightning strikes and toilets and room fresheners? Because whenever someone is bitten by a shark, you hear about it on the news. There’s something so primally terrifying about shark attacks (thank you, Steven Spielberg) that it makes for a great infotainment story. When is the last time you saw a story about a lightning victim on the news, or a guy who fell and hit his head on the toilet lid, or . . . I’m honestly not sure how you get injured by a room freshener, but you see my point.

Assumptions, the cognitive miser’s other favorite shortcut, come in many varieties, too. They guide what the perceiver sees, how that information is interpreted, and how it is remembered—forming an integral part of his or her perception of you. In the rest of this chapter, I’ll describe how some of the most powerful and pervasive of these assumptions work.

Confirmation Bias and the Primacy Effect

Perhaps the most prevalent, and most influential, of all the assumptions that guide perception is this: when other people look at you, they see what they expect to see. Psychologists call this confirmation bias.

If people have reason to believe that you are smart, they will see evidence of intelligence in your behavior—whether or not there is any. If they have reason to believe you are dishonest, they will interpret a lack of eye contact or awkward body language as evidence that you have something to hide—as opposed to evidence that you are shy, or distracted, or in gastric distress.

Confirmation bias is shaped by many factors. Stereotypes about the groups to which you belong, your apparent similarity to other people the perceivers know, and culture (yours and theirs) are among the most consequential. And of course, their own past experience with you, if they have any, plays a major role.

That last part seems fairly logical, as far as assumptions go. If you have been gregarious, pessimistic, or hot-headed in the past, it’s reasonable to think you are likely to continue to be so in the future and to interpret your behavior accordingly. If you say something that could be considered offensive or humorous, and I know you to be a jokester, then I’m more likely to go with the latter interpretation and to see the humor in your off-color remark. My past experience with you helps me to make the right call.

The problem, however, is that our early impressions of a person can hold far too much weight and can lead us astray when they paint an inaccurate picture. Psychologists refer to this as the primacy effect—that the information we get about a person early in our observation of him or her influences how we interpret and remember later information.

Imagine two children—each taking a thirty-question math test. On the first half, Timmy gets fourteen out of fifteen correct, while Charlotte gets only six. In the second half, the scores reverse—with Charlotte getting fourteen, and Timmy only six. Objectively, these two children have both performed at the same level—getting a total of twenty out of thirty problems correct. So rationally, anyone watching would conclude that they have the same level of mastery in math, right?

Only that’s not what happens—not even close. In study after study, researchers find that Timmy is perceived—even by experts like math teachers—to be the more talented of the two.2 This is because performance on the first half of the test exerts a far greater influence on judgment than performance on the second. In essence, when the test is only halfway finished, the perceiver has already concluded that Timmy is smart and Charlotte is not. What happens afterward does precious little to alter those initial impressions.

The implications of findings like these for late bloomers, or anyone who struggles initially only to excel later, are terrifying. It’s not impossible to change these initial impressions (more on that later), but it is really difficult. Charlotte would have to present overwhelming evidence of her math ability in order to override it, while Timmy can happily coast on his early success for quite a while. The problem for Charlotte is that she may not even be given the chance to override that impression if she is placed in a remedial math track or discouraged from pursuing math altogether. Think of all the promising young actors whose first roles in terrible movies cost them a future in acting, and all those whose early successes protected them from paying the price for appearing in real stinkers. (She-Devil, Meryl Streep? Really?)

The primacy effect is the reason your parents still treat you like you are twelve even when you are forty. In their eyes, you are still the person they first knew you to be—naive, inexperienced, and more than a little foolish. My mother still insists that I am disorganized and scatterbrained, despite the fact that I literally make my living writing and speaking about planning and time management. She constantly tells me that I should “learn to write things down.” Sigh.

The primacy effect is also almost entirely responsible for the fact that sometimes, we can do no wrong in someone else’s eyes, while at other times, we seem to be screwed no matter what we do—or as I like to call it, The Ballad of Ben Stiller in Every Ben Stiller Movie.

Poor Ben Stiller. The characters he portrays on film are generally well-intentioned, decent guys who make a bad impression in the first five minutes of the film and spend the last eighty-five minutes trying to undo the damage, with little to no success. Meet the Parents, Night at the Museum, There’s Something About Mary, Tropic Thunder: in each, Stiller plays a man who has made some mistakes—mistakes he is really embarrassed about. Others see him as a liar, a loser, an idiot, a talentless hack. He tries again and again to make everyone see that that’s not who he really is, but seemingly no matter what he does, it’s viewed through the lens of his past behavior.

While most of us are not making bad first impressions on such a spectacular scale (thankfully), we are all subject to this kind of bias. People who know you—particularly those who know you well—will tend to see you the way they have always seen you. One of my favorite research examples of the primacy effect comes from a study that recruited pairs of close friends. Each friend was asked to describe the other’s personality (privately) on a long list of dimensions—like funny, smart, creative, assertive, and so forth. Randomly, one friend was then selected to be the target, and the other the evaluator. Each target was asked to perform four tasks that the evaluator later watched via video.

The first task was ordinary enough—a game of general knowledge trivia, with difficult questions like “How high is Mount Everest?” and “How many people live in Tokyo?” The rest reads like an episode of Whose Line Is It Anyway? A role-play task required the target to call a “neighbor” (actually a member of the research team) and demand that they turn down the volume of their stereo. Next, the target was asked to spontaneously tell a brief story, involving the words corkscrew, holiday, catastrophe, and glove box. (Go ahead. Try it.) Last, the target sang a song of his or her choosing, told a favorite joke, and, in a final act of humiliation, had to pantomime the word party. I’m not sure what the participants in this study were paid, but whatever it was, it wasn’t enough.

The evaluator (the lucky one) was then asked to rate his or her friend’s performance on these four tasks on a list of dimensions: How intelligent was the behavior? How funny? How creative? Compared with unbiased observers (i.e., strangers), ratings of the friend’s behavior were dramatically skewed, almost uniformly reflecting their prior opinion of the friend, rather than the actual behavior displayed. In other words, even if Harry’s joke was just awful or if he scored miserably on the test of trivia, Bob still thought the joke was funny if he thought of Harry as a funny guy, and Bob blamed the test (rather than Harry) for the poor performance if he thought of Harry as smart.

This, in a nutshell, is why it is so hard to get people to revise their opinions of their friends and lovers, their coworkers, or their employees. The perceivers aren’t necessarily being stubborn or willfully putting blinders on—they just don’t see what you see, because the assumptions guiding their perception are very different from the ones guiding yours. Something clearly inconsistent with someone’s existing view of you might get some notice, but anything only moderately inconsistent with a person’s existing view of you gets ignored or reinterpreted to provide a better fit. This is one of the reasons that first impressions are so important to get right and are so resistant to change after they have been formed—and that when you are looking to change someone’s opinion of you, you are going to have to go big or go home.

Stereotypes

Under most circumstances, people use stereotypes about the groups to which you belong (or appear to belong) to interpret everything you do and say. And most of the time, people don’t actually know they are doing it. In fact, they don’t even have to believe a stereotype to be affected by it.

At its most basic, stereotyping is a form of categorization—something human brains have evolved to do swiftly and automatically. Categorization allows us to navigate and interact with new objects in the world with relative ease. You walk into a room you have never been in before, and you know immediately that the thing next to the table—the one with four legs and a horizontal square on top, that isn’t moving and appears to be made of wood—is a chair. Imagine what a colossal pain it would be if every time you encountered a new chair, or apple, dog, or tree, you had to figure out everything about it from scratch. Like a visitor to an alien planet, every object would seem new, strange, and possibly dangerous. But instead, once you’ve figured out the basics of how chairs look and what they do, every new chair you encounter is a no-brainer. Even if you’ve never seen that particular chair before in your life, you know it’s for sitting, rather than for eating or petting or climbing. You are, in fact, a world-class identifier of chairs, cars, rocks, fish, and all sorts of other things—you can do this at a glance because your brain was designed to do it.

And you have beliefs about these different categories of things, too. You believe, for instance, that rocks are hard, cars run on gasoline, and fish swim. These beliefs aren’t right 100 percent of the time—a dead fish doesn’t swim, and my high school boyfriend’s car seemed to rarely run at all—but they are useful guidelines for knowing what kind of behavior to expect from a particular thing and how to interact with it.

Stereotypes are the beliefs we have about categories of people, and we categorize people in lots of ways: by gender, race, sexual orientation, ethnicity, profession, and socioeconomic class. Some of the beliefs associated with these categories are positive, such as Asians are good at math or firefighters are brave. Others are decidedly less so (e.g., redheads are hot-tempered; women are weak; poor people are lazy).

We have beliefs about categories as a result of other kinds of differences, too. Our hobbies, interests, and abilities (or lack thereof) can form the basis of stereotypes. Science-fiction fans are brainy, socially challenged, and allergic to the outdoors. Hipsters like beards, irony, and making their own pickles. Environmentalists are liberal and uptight and may also enjoy making their own (organic, locally sourced) pickles.

We even categorize people according to patterns in their facial features—and the consequences of it can be astounding. For instance, baby-faced people—those who have large eyes; thinner, higher eyebrows; large foreheads; and small chins on a rounded face—are perceived to be more innocent and consequently more trustworthy than mature-faced people. I suppose this isn’t surprising, since baby-faced people remind us of babies—beings who are practically synonymous with innocence. The problem, obviously, is that while actual babies are less likely to do intentional harm, there’s nothing keeping baby-faced adults from doing so. How does this stereotype affect the odds that these adults will be punished when they do intentional harm?

Researchers examining the results of over five hundred small-claims court cases found that differences in baby-facedness had a huge impact on whether the defendant was found guilty. For claims of intentional harm (e.g., a neighbor deliberately crashed a car into another’s fence after a heated argument), the most mature-faced defendants had a 92 percent chance of being found guilty, compared with only a 45 percent chance among the most baby-faced. But when it comes to negligent harm (e.g., a neighbor wasn’t looking and accidentally backed into the other’s fence), the baby-faced were more likely to be found guilty (85 percent) than the mature-faced (58 percent).3 In other words, if someone with a delightfully babyish face, like Jennifer Lawrence, Leonardo DiCaprio, or a young Mark Hamill, ran over your begonias, you’d be likely to think he or she was just distracted by a frolicking puppy or a happy song on the radio. But when Clint Eastwood runs over your begonias, you’re pretty sure he’s doing it on purpose. We are very comfortable saying that baby-faced people are screwups, but uncomfortable thinking they are deliberately bad. On the other hand, a mature-faced person is capable of malicious deeds, but seems less likely to be a bonehead. Being mature-faced gets you taken seriously, even if you aren’t fully trusted. It’s a trade-off.

We tend to think of stereotypes as inherently negative, but that’s not really the case. Just as baby-faced people appear more trustworthy, Asians are believed to be more skilled in math and science, women are assumed to be more caring and nurturing, and blacks more athletically gifted. The contents of a stereotype can have either positive or negative implications for how you will be seen by others. And the more typical of a particular group you seem to be—the more you match others’ ideas of what a member of a group should look, sound, and act like—the more strongly the stereotype will be applied to you.

The reality is, stereotypes can be used to your advantage. And on some level we understand this intuitively, when we dress for success or try to fit in. When I applied for my first real job at Bell Labs while still in college, I went to the interview in a suit with my hair pulled back neatly and a minimum of makeup. I wanted to look like a typical Bell Labs scientist so that the company would assume I had the other stereotypical traits of that particular group: intelligence, seriousness, discipline. If I had gone in looking like a typical college student—wearing a flannel shirt, shorts, and a baseball cap (it was the early 1990s)—I would have activated an entirely different and conflicting stereotype: immaturity and inexperience.

In my case, it was clear which group I wanted to appear to belong to. But there are, unfortunately, instances where navigating these waters isn’t so easy—when you want to convey qualities that don’t coexist nicely in a single stereotype.

Imagine two candidates being interviewed for a leadership position in your company. Both have strong résumés, but while one seems to be bursting with new and daring ideas, the other comes across as decidedly less creative (though clearly still a smart cookie). Who do you think would get the job? And just as important, who should?

The answer to the question of who gets the leadership job is usually the less creative candidate. Why? After all, creativity—the ability to generate new and innovative solutions to problems—is obviously an important attribute for any successful business leader. Research shows that leaders who are more creative are better able to effect positive change in their organizations and to inspire others to follow their lead.4

The problem, put simply, is this: our idea of what a typical creative person is like is completely at odds with our idea of a typical effective leader. Creative types like designers, musicians, and writers are (stereotypically) nonconformist and unorthodox—not the sort of people you usually put in charge of large organizations. Effective leaders, it would seem, should provide order, rather than tossing it out the window.

Because we unconsciously assume that someone who is creative can’t be a good leader, any evidence of creativity can diminish a candidate’s perceived leadership potential. For instance, in one study, fifty-five employees rated the responses of nearly three hundred of their (unidentified) coworkers to a problem-solving task, judging each response both for creativity (the extent to which the ideas were novel and useful) and as evidence of leadership potential. Creativity and leadership potential turned out to be strongly negatively correlated: the more creative the response, the less effective a leader the responder appeared.5

In another study, participants were told to generate an answer to the question “What could airlines do to obtain more revenue from passengers?” and pitch their ideas in ten minutes to an evaluator. Half the participants were asked to give creative answers (both novel and useful, e.g., “offer in-flight gambling with other passengers”), while the other half were told to give useful but non-novel answers (e.g., “charge for in-flight meals”). The evaluators, unaware of the different instructions, rated participants who gave creative answers as having significantly less leadership ability.

Even though creativity is a much-admired quality, perhaps more so today than ever before, there is a very clear, unconscious bias against creativity when it comes to deciding who gets to be in the driver’s seat, thanks to stereotyping. And because of this bias, organizations, believing they are picking people with clear leadership potential, may inadvertently hand leadership positions to people who lack creativity and will preserve only the status quo.

The Halo Effect

Do you think that someone who is physically attractive is more likely to also be intelligent, honest, creative, or kind? Of course not, you say. There’s no reason for those things to go together. Well, that’s perfectly true—but your cognitive miser sees it differently. The tendency to infer, from a single, powerful positive quality, that someone possesses other positive qualities as well is called the halo effect. And, aside from first impressions’ resistance to change, the halo effect provides yet another reason why first impressions are so important.

If you are handsome or charming, people will assume you are probably smart and trustworthy, too. And in a kind of reverse-halo (a pitchfork effect, perhaps?), if you are unattractive or charmless, people will assume you are dull and dishonest as well. Perhaps my favorite research example of this pervasive phenomenon is a study conducted a few years after President Ronald Reagan left office. Psychologists asked people to guess what Reagan’s grade point average (GPA) was when he was an undergraduate at Eureka College—something the vast majority of people would have no way of actually knowing. The researchers found that the participants who had liked Reagan thought that he had an A average, while those who had disliked him thought he had a C average. And the more strongly they had liked or disliked him, the more certain they were about his GPA. They knew it to be true. (Incidentally, he had a C average. That’s neither here nor there, but I thought you might be curious.)6

Halo effects are strengthened by another largely unconscious process—namely, that holding contradictory views of someone (for instance, believing that John is a good person, while knowing at the same time that John cheats on his taxes) causes a psychological pain called cognitive dissonance. When asked to put it into words, people describe it as a kind of nagging discomfort or a state of tension. The only way to resolve the dissonance and get rid of the discomfort is to change one of the conflicting views (i.e., choosing to ignore the fact that John cheats on his taxes, or deciding that he is not a good person). So it’s just easier to believe that people who have one positive quality have lots of other ones, because there’s no risk of creating dissonance. The engine of your mind keeps running smoothly.

The False-Consensus Effect

There’s another simple assumption we unconsciously use to make things easier on ourselves: Other people think and feel what I think and feel. It’s hard for other people to know what you’re thinking or feeling. They have to search for clues in your words and actions, carefully considering them and their context. They have to try to take your perspective, rather than their own. It takes a lot of work to get it right. But a lot of work is the last thing the cognitive miser wants to do.

Psychologists call this tendency to believe that others feel the way we do the false-consensus effect, and evidence for it is all around you.7 Ever wonder why people in minority extremist political groups are always acting as if they speak for “the American people”? It’s because they genuinely believe that they do—they assume other people agree with them about how the country should be run, and they don’t bother to pay for the polling that would tell them otherwise. Shy people think shyness is more common than it is. People who are prone to depression, or have an optimistic outlook, or sweat easily even on cool days, think that most other people do, too. When it comes to everything from religious views to favorite flavors of ice cream, people assume you see things as they do. Because, why wouldn’t you?

We also have a tendency to think our bad habits and flaws are universal—that they are, in fact, quite normal. For example, people who are quick to lose their tempers, who cheat on their taxes (or their spouses), or who smoke, drink, or take drugs overestimate the frequency with which others give in to these temptations, too. Everybody does it, we think. I’m not so different.

But when it comes to goodness, it’s a very different story—because we each tend to believe that we have better values and are generally more honest, kind, and capable than others are. Psychologists call this assumption false uniqueness. A great example of this tendency can be found in the work of Chip Heath, Stanford University psychologist and author of Made to Stick. He showed that while most of us rank intrinsically motivating factors—such as skill development—as the most important to us in our careers, we believe that other people care primarily about extrinsic motivators, like compensation. In other words, we believe that when it comes to the work we do, our own values are more noble and authentic than those of our colleagues.

Heath and his team gave University of Chicago MBAs the opportunity to rank eight possible career motivations in terms of their personal importance and then to predict how others would rank them—specifically, how customer service representatives at a specific unit at Citibank would rank them.8 Finally, the team asked the Citibank reps themselves to rank their own motivations.

The MBAs rated “learning new things,” “developing skills,” and “feeling good about myself” as their top three motivators, with “pay” coming in fourth. What did they predict for Citibank reps? That the reps’ top three motivators would all be extrinsic: pay, job security, and benefits. Ironically, the Citibank reps didn’t even include pay in their top four—it came in a distant seventh. They had the same top three motivators as the MBAs, and their fourth was “accomplishing something worthwhile,” another intrinsic motivator. Here’s another illuminating way of looking at the results from this study: the MBAs listed an extrinsic motivator as their own number one motivator only 22 percent of the time, but listed it as others’ number one motivator roughly 85 percent of the time. And the cost of this particular assumption of false uniqueness? A workforce that is routinely undermotivated and misguidedly incentivized.

My favorite example of false uniqueness—the one I’ve taught in my undergraduate classes for years—comes from a national survey conducted in the 1980s.9 The respondents were asked whether they themselves obeyed each of the Ten Commandments and were then asked to estimate the percentage of Americans who did the same (table 2-1).

TABLE 2-1


Us versus them: how we perceive our own and others’ adherence to the Ten Commandments

Commandment                                                          I do   Others do

Do not curse or use profanity                                         64%      15%
Go to church, synagogue, or mosque on holy days                       64%      22%
Respect your parents                                                  95%      49%
Do not commit murder                                                  91%      71%
If married, do not have a sexual relationship with
  someone other than your spouse                                      86%      45%
Do not steal                                                          90%      54%
Do not say things that aren’t true about another person               88%      33%
Do not envy the things another person has                             76%      23%
Do not covet another person’s husband or wife                         84%      42%
Worship only the one true God                                         81%      49%

These results are mind-blowing for a number of reasons. (For example, apparently one in ten Americans either have committed a murder or aren’t sure if they have.) But what’s abundantly clear is that Americans in general have a pretty poor opinion of the morality of their compatriots. There are, if these numbers are correct, only a handful of good men and women adrift in a sea of lying, cheating, envious, false-God-worshipping thieves who curse like truckers at their own parents.

. . .

To summarize, there are some assumptions so universal and automatic that you can count on other people to make them about you (and you can count on people to have no idea that they are doing it):

  • You are who people expect you to be, in light of their past experience with you.
  • The first impression you give is the “right” one, and it shapes how everything else about you is perceived.
  • You are like the other members of groups to which you appear to belong.
  • If you have a very positive trait—if you are smart, beautiful, funny, kind, and so forth—you are likely to have other positive traits.
  • You share the opinions, feelings, and foibles of the perceiver, but not necessarily his or her ethical standards and abilities.

You are never really starting from scratch with another person, even when you are meeting him or her for the first time. The perceiver’s brain is rapidly filling in details about you—many before you have even spoken a word. Knowing this gives you a sense of what you’ve got going for you and what you might be up against. And the more you can know in advance about your perceiver’s likes, dislikes, strengths, and weaknesses, the better equipped you will be to anticipate what’s being projected onto you.

You don’t have to take all of this passively. For example, you can deliberately emphasize your group memberships or your good qualities, to benefit from positive stereotypes and halo effects. You can take pains to make the best possible impression right out of the gate, to use the primacy effect to your maximum advantage. You can make your opinions and values explicitly known. When you have made the wrong impression or have changed in ways you want the people who know you to notice, you can use strategies that will get them to update their beliefs about you (more on this in chapter 9). But however you choose to use the information, it’s essential to start by knowing where you probably stand. And since these assumptions are always in play, they are an integral part of that knowledge.

Fortunately, as wired as we are to jump to half-baked conclusions based on stereotype-riddled first impressions, we are also wired to correct those impressions—when it is worth our while to do so.
