CHAPTER 2

Outsourced Thinking

Our constant struggle to optimize often leads us to recognize the costs and difficulties of trying to know everything. So what do we do? We run headlong into the arms of experts, technologies, and protocols (or rules) that offer the possibility of salvation. Expert individuals, like doctors and lawyers, are people to whom we often defer because of their domain knowledge. And the embedded expertise in technologies allows algorithms to guide our thinking just as a person might, albeit in less noticeable ways. Lastly, there is extended expertise, by which rules are used to control the thinking and actions of those in organizations. Let’s explore each of these three forms of expertise and how they can impact our lives.

Experts

It is often easier to let people—experts, for example—act as viewfinders, focusing our attention on what they think matters. These individuals are granted authority through degrees and licenses—think doctors with MDs, pilots with commercial pilot licenses (CPLs)—or sheer reputation. By outsourcing our focus to them, we willfully let them take control over our field of vision, blind to what they leave out. We let them frame our decisions. Many of us don’t even think about it this way, and we mindlessly follow their guidance. At the very least, we should think about, and ideally ask about, what other variables might be worth considering.

Draft Pick #199

Tom Brady, for years the superstar quarterback of the New England Patriots, is often called the greatest quarterback of all time by sports writers and commentators. He has played in numerous Super Bowls, has won the league MVP and Super Bowl MVP awards several times, and has broken lots of NFL records. My kids grew up watching Sunday football as a sporadic but enjoyable pastime, but when Brady was on, they almost always watched. And in the ultimate testament to his success, they grew up expecting to see Brady in the Super Bowl each year—and for most years, they were right.

But the Tom Brady we know almost wasn’t.

All manner of talent-identifying experts decided Tom Brady didn’t have what it takes to succeed in the National Football League. Countless coaches, assistants, scouts, and sports media analysts passed him over. The result: he was the 199th pick in the NFL draft the year the New England Patriots selected him. Just for reference, most of the attention during a draft is paid to the top twenty-five picks, as they tend to represent the most promising talent entering the league that year. Brady sat on the bench for a year until a teammate’s injury gave him the chance to play. That simple event led to his meteoric rise to become one of professional football’s most successful and accomplished athletes.

Brady had been passed over by team after team, each run by professionals who should have been able to identify the upside of recruiting him. How did they miss his potential? Might it have been too much focus on expert opinion? In fact, throughout professional sports, overreliance on expert judgment is a widespread problem, as many teams defer their assessment of an athlete’s potential to seemingly objective data from algorithms and professional scouts.

The NFL Combine is a set of timed and scored physical drills that attempts to “put players on an even playing field by standardizing the drills and measurements taken.”1 Evaluators measure an athlete’s physical attributes such as height, weight, hand size, and arm length. This data is then combined with performance data: an athlete’s speed is measured via a forty-yard dash, jumping ability via a vertical jump, and so on. Brady was unremarkable in a physical sense. When it came to his performance stats, however, he definitely stood out, though not in a good way. He had the shortest vertical jump and was the second slowest in the forty-yard dash of the class of 2000.2

Experts then combine an athlete’s results and summarize the scores on a scale of zero to one hundred; Tom Brady was given a score of forty-five for his size, zero for his speed, thirty-three for his agility, and nineteen for his quickness. The professional judges, who were tasked with gauging a player’s potential in the NFL, seemed to agree: Brady didn’t have what it took to succeed in the NFL. In aggregate, Brady’s overall rating, also out of a possible one hundred, was twelve. The grade put on his final NFL Combine report was an F.3 Oops.

The Goblet of Rejection

J. K. Rowling’s series of stories about the magical world of Harry Potter has sold hundreds of millions of books, grossed billions at the box office, and inspired several popular theme parks. Quite simply, the Harry Potter series became the most lucrative set of stories in the history of human communication. Most estimates suggest the revenues associated with the series top $20 billion, a sum that has made the once penniless author a billionaire.

But the stories of Harry, Ronald, Hermione, Lord Voldemort, and others may never have seen the light of day. Why’s that? Because all the experts to whom the story was first sent rejected it as too long, noting that children wouldn’t have the attention span to finish it. Rowling collected dozens of rejections before seeking the help of literary agent Christopher Little, a name she chose because it sounded like a children’s book character. But while composing a rejection letter to Rowling, an assistant was struck by the illustrations and spurred the agency to take on the author, and then publishers began sending the rejection letters to Little rather than Rowling. It was only when Little got desperate enough to ask Bloomsbury Publishing Chairman Nigel Newton for a personal favor that things started looking up. As noted by Aren Wilborn in Cracked, a multiplatform satire brand, “Newton did something that apparently never occurred to other children’s book publishers, which was to show it to an actual child.”4 Newton handed the manuscript to his eight-year-old daughter, who read it within hours and demanded more. Sensing an opportunity, he offered Rowling a nominal advance and set the initial run at five hundred copies—not exactly a ringing endorsement of expert judgment for what would become one of the best-selling book series of all time.

The rest, as they say, is history. While the achievement of the series is indeed remarkable, it’s also stunning that such blockbuster success was missed by almost every expert judge. One might have expected that literary experts (people skilled and experienced in identifying and nurturing talent) would have seen the story’s potential and developed and promoted it, even if slowly and inefficiently. Not so.

Technology

In addition to people, we also outsource our thinking to technology. Often, the worst consequences of our overreliance on technology products arise from rigid rules embedded in them. The problem is that technological systems can’t think for themselves. A computer system can’t double-check whether something makes sense.

Even when we’re not conscious of it, technologies constantly determine what passes into our field of view. They frame our decisions for us. Sure, we could search for a second opinion, but that would take up precious time most of us don’t have. Who even goes to the second page of a Google search? Ninety percent of clicks from searches come from the first page of results. Five percent come from the second page, and only one percent from the third.5 Thirty-three percent of searchers don’t make it past the first result! Time considerations aside, the fundamental problem is that we often forget the subversive process of focus management that comes with using these technologies.

Dangerous Directions

The downsides of blindly relying on algorithms are exemplified by what happens when small errors surface in navigation software. GPS navigation aids allow us to take our focus away from navigating—sometimes with disastrous results.

In 2008, a bus carrying the Garfield High School softball team crashed into a pedestrian bridge in Seattle, sending twenty-one kids to the hospital (luckily, with only minor injuries). The driver’s GPS had routed him under the bridge, even though it was too low for a bus. Why didn’t he pay attention to the low bridge as he approached it? One reason, perhaps the reason, is that the driver had outsourced his thinking to the technology. An algorithm had given him the route, so he didn’t stop to think about the bridge’s height. You see, the GPS had a “bus” setting.

The driver and the bus company had not considered the possibility that the system could mislead. As the president of the bus company put it, “We just thought it would be a safe route because, why else would they have a selection for a bus?”6 The bus setting gave them a false sense of security.
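To make the failure mode concrete, here is a minimal sketch of how a routing filter behind a “bus” setting might work. Everything in it is hypothetical: the segment names, the clearance values, and especially the default behavior. But it shows how a single missing data point can quietly defeat the safety check that a vehicle-specific mode seems to promise.

```python
# Hypothetical sketch of a "bus" routing mode. Segment names, clearances,
# and the default behavior are illustrative, not from any real product.

BUS_HEIGHT_FT = 11.5  # assumed bus height

segments = [
    {"name": "Main St",        "clearance_ft": 16.0},
    {"name": "Pedestrian Br",  "clearance_ft": None},  # height missing from map data
    {"name": "Elm St",         "clearance_ft": 13.5},
]

def passable(segment, vehicle_height):
    """Return True if the vehicle fits under the segment's lowest structure."""
    clearance = segment["clearance_ft"]
    if clearance is None:
        # The dangerous default: unknown clearance is treated as "no restriction",
        # so the bus gets routed under a bridge the database knows nothing about.
        return True
    return clearance > vehicle_height

route = [s["name"] for s in segments if passable(s, BUS_HEIGHT_FT)]
print(route)  # ['Main St', 'Pedestrian Br', 'Elm St'] -- the low bridge slips through
```

The point is not that any particular app works this way; it is that a mode labeled “bus” invites exactly the trust the bus company described, whether or not the underlying data can support it.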

In a similar case in 2013, Apple Maps routed drivers across an operating runway at Alaska’s Fairbanks International Airport.7 Drivers mindlessly continued beyond road signs warning them of the active runway and drove onto the airport grounds. Listening attentively to those directions, the drivers stopped thinking about where they were actually driving. To prevent a real disaster and loss of life, airport officials quickly erected barricades to stop more of the same mindless and very risky outsourcing of thought to GPS-dictated directions.

And in one of my favorite examples of GPS-managed attention, a man drove his van up a small hiking trail (labeled by locals as a glorified goat path) until the system announced he should turn around.8 At that point, he was halfway up a mountain and couldn’t turn around. A heavy-lift helicopter was deployed to rescue him and get the van off the trail. Clearly, looking out the window, rather than listening to the computer-generated directions, would have been more productive.

Blind reliance on technology can literally lead us astray. Lest you think this is just an American problem, consider an experience I had in South Africa. I was invited to address a large audience in Johannesburg, and on the day prior to my speech, my hosts had arranged for me to appear on various TV shows to discuss the global economy and its implications for South Africa.

I was scheduled to appear on CNBC’s Powerlunch Africa show and was told the studio was not far from my hotel. Given the warm climate, my hosts had a member of their media relations team drive me to the exact building. Fighting a bit of jetlag, I asked if we had time for a cup of coffee (it was more than an hour before I was scheduled to be on air), and the poised woman obligingly stopped when I spotted a coffee shop. As we sat and talked about the likely questions I would be asked, we suddenly realized that I was scheduled to be on air in approximately twenty minutes. She seemed nervous, so I asked her if there was a problem. She said there was no problem, but she felt we should probably walk, as the traffic had grown substantially while we had chatted. She put the address into her phone’s mapping app and off we went.

Fifteen minutes later, I was sweating like I had just run a marathon, and her app said we had another two kilometers to the destination. We picked up the pace and, after making a left turn, found ourselves on the same block where we had gotten coffee. The studio was literally one hundred yards from the cafe. I ended up postponing my interview for a bit to accommodate a cool-down period, during which I consumed Gatorade and used the hand blowers in the bathroom to dry my shirt. Having sprinted in a circle before a major media appearance, I was so disturbed that I had to investigate what happened. I asked to see her phone and noticed she had set the GPS directions to “driving” rather than “walking,” meaning she had blindly taken me on a four-kilometer loop dictated by one-way streets.

In what may be the most extreme case of pushing back against the impact of mindless reliance on artificially intelligent navigation systems, one Italian town recently banned Google Maps after too many people had gotten lost while following the directions provided by the app.9 Salvatore Corrias, the mayor of a seaside town on the island of Sardinia, said that “too many sedans and small cars get stuck on impassable paths.”10 An article in October 2019 noted the town had conducted 144 rescue missions in the prior two years. To combat the growing problem, the town has contacted Google and has placed signs alongside roads that read, “Do not follow the directions of Google Maps.”11

Deadly Dependence

In 2014, Vanity Fair featured a gripping story by William Langewiesche that described the crash of Air France 447; titled “The Human Factor,”12 the article showed how blind reliance on technology resulted in a (potentially) needless loss of lives. Air France 447, an Airbus A330 that boasted new technology to constrain pilots to within the plane’s capability limits, took off smoothly at 7:29 p.m. and hugged the Brazilian coast before setting off across the Atlantic. As the plane moved offshore, the cockpit received a message from the dispatchers in Paris: a line of thunderstorms was developing directly on the charted course. Pierre-Cédric Bonin, the pilot flying the plane, grew nervous, but the pilot in charge, Marc Dubois, expressed little concern.

The plane was on autopilot, heading toward Paris at 550 miles per hour, 35,000 feet above the sea. They were about to enter the Intertropical Convergence Zone, an area near the equator where thunderstorms are common. After dodging some bad weather, the pilots found themselves flying through a barrage of ice crystals that bombarded the windshield. The turbulence wasn’t particularly unusual.

At 11:10 p.m., the plane’s pitot tubes (probes that measure airspeed) failed, suggesting the plane’s speed had suddenly dropped dramatically. The instruments also hinted that the plane had lost a bit of altitude. The autopilot system, which required airspeed data, disengaged. Alarms sounded. The pilots were now actually flying the plane, free and clear of automated controls.

Bonin immediately pulled the plane up to gain altitude, which reduced the speed. A stall warning came on, generating even more confusion in the cockpit. As a plane slows down, it loses lift, threatening its ability to stay aloft. To regenerate the speed necessary to maintain lift, Bonin would have to lower the nose of the plane. But, in the heat of the moment, that’s not what he chose to do. Instead, Bonin continued to pull up on the stick.
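For readers curious about the aerodynamics behind that choice, the standard textbook lift equation (the conventional model, not something cited in the article) makes the trade-off explicit:

$$L = \frac{1}{2}\,\rho\,v^{2}\,S\,C_{L}$$

Here $\rho$ is air density, $v$ is airspeed, $S$ is wing area, and $C_{L}$ is the lift coefficient, which depends on the angle of attack. Because lift scales with the square of airspeed, even a modest loss of speed erodes lift disproportionately, and the surest way to recover speed at altitude, lowering the nose, is precisely the action that feels wrong in the moment.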

Another pilot, recognizing the speed indicators were malfunctioning, encouraged Bonin to point the nose of the plane down. Despite answering with, “OK, I’m going back down,”13 Bonin merely lowered the rate of ascent. The plane continued to lose velocity.

Eventually, the pilots lowered the nose enough that the climb leveled off. The plane was neither gaining altitude nor losing speed. If they had lowered the nose another few degrees, they would have been back where they started. But that’s not what happened. Instead of lowering the nose, Bonin again pulled back on the stick. The plane ascended, generating a cacophony of stall warnings in the cockpit. The airspeed indicators were now working. After repeated stall alarms, the plane stopped ascending at around 38,000 feet and started falling with its nose up, barreling toward the ocean at 3,900 feet per minute.

The plane rapidly fell below 35,000 feet, its plunge rate accelerating to 10,000 feet per minute. The nose was so high that the computers began rejecting the data as invalid. Warnings stopped. But when Bonin tried pointing the nose down, the warnings rang again, leading him to pull up once more. The descent rate increased to 15,000 feet per minute. The plane continued to plunge toward the ocean.

Four minutes and twenty seconds after the airspeed indicators failed, Flight 447 belly-flopped into the Atlantic Ocean. All 228 people on board died. In this case, the consequences of outsourced thinking were deadly.

Protocols and Rules

In addition to people commanding our attention and technology hijacking our thinking, rules manage our focus and frame our decisions. They limit what we and the people around us can do, and they give us a false sense of security. In taking options off the table, they narrow our focus to what remains. In trying to prevent undesirable outcomes, they make us overconfident that those outcomes have been successfully ruled out. Rules, it seems, are a means of embedding managerial expertise into a set of procedures that employees are asked to blindly follow.

Bureaucratic Blues

Of all the organizations blinded by rigid rules, the Department of Motor Vehicles (DMV) looms largest in many Americans’ minds. The average American gives the DMV as much respect as racist, drug-dealing sex offenders who evade alimony payments. Visiting the DMV is not a particularly enjoyable experience, either. Most Americans would probably prefer a root canal.

Try typing “DMV horror” or “DMV nightmare” into Google and you’ll get hundreds of thousands of hits within half a second or so. Many are so disturbing and ridiculous you simply can’t help but laugh. And they’re all the result of an overreliance on rigid rules: broken vision machines resulting in people being listed as blind in an official government system; typographical errors that can’t be corrected because the system doesn’t allow for changes; or bureaucratic shuffling that infuriates some and necessitates blood pressure medication for others. Go here, get that, talk to them, fill out this form, submit that. If you’re still not entertained after reading several of these accounts, jump on YouTube, type in “DMV crazy,” and you’ll be presented with tens of thousands of videos of Americans losing their composure in the face of DMV frustration.

Consider this story of an employee who couldn’t look beyond the color of a book she had to consult. Here’s the situation, which took place in April 2000.14 A man purchased a used car in California that was manufactured in 1981. He lived in Nevada, so he went to his local DMV in Henderson to register his vehicle. He arrived prepared with the signed California title, inspection documentation, proof of Nevada insurance, and all other paperwork filled out. To assess a fair tax on the purchase, the state of Nevada needed to determine the current market value of the car. DMV protocol is to use the Kelley Blue Book value or to adjust the original manufacturer’s suggested retail price (MSRP) for depreciation since the car was made. The problem: the Kelley Blue Book available to the clerk didn’t have car values going back to 1981 models.

So the clerk was stumped, because she blindly followed the DMV’s rigid rules. The car’s owner noticed a NADA guide (a similar price-estimation manual produced by the National Automobile Dealers Association, but one that goes back further in time) behind the counter and asked the clerk to consult it. She refused to do so, insisting she had to use the blue book value, and the NADA guide was not blue. Frustrated, the customer tried another route. He explained that the car would be completely depreciated at this point and offered to pay the minimum tax.

The clerk rejected the offer and suggested the car owner try getting the original window sticker. His response: “The former owner is dead! I bought the car off his surviving family! There is no way that I can provide you with such a document.”15 She couldn’t move on, insisting on a blue book value or an original purchase price. Those are, after all, the DMV’s rules. The clerk concluded that there was nothing further she could do and motioned for the owner to leave, noting that there were other customers waiting.

After a heated standoff, a supervisor eventually entered the scene and located the original MSRP for the car in the NADA guide. Even then, the clerk insisted that she could not use the price because it wasn’t from the blue book. The supervisor explained that “blue book” is simply shorthand for used car and truck prices because Kelley’s is the most commonly used guide. After several aggravating hours, the customer’s paperwork was accepted in its original form.

Recently, Kristen (yes, my patient movie-watching companion from chapter 1) and I encountered a silly set of rules. We went out with our kids for an early dinner, so early that those without children might have confused it with a late lunch. Our destination was a relatively nice Italian restaurant. It was almost completely empty. But when we asked for a table for four, the very pleasant host looked down to consult his computer. Two minutes later, he lifted his head and said, “It’ll be about thirty minutes.” My wife and I laughed and looked over his shoulder at the three occupied tables and the twenty-seven or so empty ones. My watch read 4:20 p.m. Thirty minutes might as well be seven hours when you’re with two hungry kids.

I started to say, “Thanks, maybe another time,” but my wife couldn’t help herself.

She said, “I’m sorry. I may be mistaken, but it appears you have tables available. Are you expecting a mad rush based on reservations?” (This wasn’t as cynical as you might think. We do, after all, live in a family-oriented suburb of Boston and young families eat early.)

The host said that he was not allowed to seat more than fifteen people in any thirty-minute window. He then pointed to the twelve people in the dining room and indicated that we would tip the balance to sixteen, and that was against the rules. He needed to pace arrivals so the kitchen wouldn’t “get slammed.” After I suggested he could consider two children the equivalent of one person, he leaned in and whispered, “We’re not that kind of restaurant. We don’t cut corners.” His focus on following the rules led to the perplexing situation in which four hungry diners were sent away in the name of delivering a good dining experience.

Rigid rules may exacerbate, rather than mitigate, the very risks that experts are trying to minimize by imposing them. The restaurant’s rule was probably meant to ensure a good dining experience, but it instead annoyed four potential customers.

Sick Systems

In late September 2014, Thomas Eric Duncan returned home to Dallas from a trip to Liberia, ground zero of a raging Ebola epidemic.16 He suffered from severe abdominal pain and had a high and rising fever. Concerned, he went to Texas Health Presbyterian Hospital Dallas, where he was subjected to a battery of questions and simple diagnostic tests. When his temperature was taken, it registered 103 degrees.17 Despite the frenzy over the rapidly spreading disease in Africa, news that was dominating virtually every American outlet, Ebola was not seriously considered as a possible cause of his ailments during this first visit. His travel history was not given adequate attention. A mere thirty-five minutes after the concerning thermometer reading, health-care workers sent him home when his temperature dropped to 101.2 degrees, as the rules indicated that was the correct course of action.18

He later returned to the hospital and was admitted a second time, now with his family saying they believed he had Ebola. He had all the symptoms. Despite these warning signs, hospital protocols did not dictate elevated caution until a confirmed diagnosis had returned from the labs. And so, anyone around the infected patient was no more protected from Duncan than if he had had the flu.19

Eventually, Texas Health Resources, the parent company of the hospital, announced Thomas Eric Duncan was America’s first confirmed Ebola patient. Americans on Main Street panicked alongside those on Wall Street. Cruise ship and airline stocks plunged as fears of enclosed public travel escalated. Trips were cancelled. The fear was palpable.20

Amidst the chaos and confusion, Nina Pham, a twenty-six-year-old nurse working in the hospital’s intensive care unit, was assigned to care for Duncan. To reduce her risk of exposure to the deadly virus, and drawing on what she could learn from the internet, she donned protective gear, leaving only her neck and hair exposed. A little over a week later, Duncan died.

Shortly thereafter, Pham woke up with a fever, checked herself into the hospital, and tested positive for Ebola.21 At the same time, the US Centers for Disease Control and Prevention (CDC) was closely monitoring another nurse, Amber Vinson, who had cared for Duncan alongside Pham while he was sick; she had even put a catheter in him while he was in the midst of extreme vomiting and uncontrollable diarrhea. As a result of being in close direct contact with two confirmed Ebola patients, Vinson was asked to check in with the CDC twice a day.

While on a trip to Cleveland, Vinson developed a slight fever. According to CDC protocol, she was required to check in with the center before boarding any flight, so before returning to Dallas, she called the CDC. The representative asked Vinson for her temperature. She reported 99.5 degrees. The decision rule had a no-fly trip point of 100.4 degrees. As a result, Vinson was cleared to board Frontier Airlines Flight 1143 along with 132 unsuspecting passengers. Less than thirty hours after boarding that flight, Vinson was diagnosed with Ebola.22
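Stripped to its essentials, the clearance decision was a single-number threshold rule. The sketch below is hypothetical (the function names and the context-aware variant are illustrative, and only the 100.4-degree trip point comes from the account above), but it shows how a rigid rule can clear a traveler whom even minimal context would have grounded.

```python
# Hypothetical sketch of the no-fly decision as a rigid threshold rule.
# Only the 100.4-degree trip point comes from the account in the text;
# everything else is illustrative.

NO_FLY_TEMP_F = 100.4  # the decision rule's trip point

def cleared_to_fly(temp_f):
    """The rule as applied: one number decides everything."""
    return temp_f < NO_FLY_TEMP_F

def cleared_to_fly_with_context(temp_f, exposed_to_confirmed_case, fever_rising):
    """A hypothetical context-aware variant of the same rule."""
    if exposed_to_confirmed_case and (temp_f > 98.6 or fever_rising):
        # Any fever after direct exposure warrants waiting, even below the trip point.
        return False
    return temp_f < NO_FLY_TEMP_F

print(cleared_to_fly(99.5))                           # True: Vinson is cleared to board
print(cleared_to_fly_with_context(99.5, True, True))  # False: stay put and monitor
```

The rigid version is simpler to administer, which is precisely its appeal; the cost is that it discards the context that made Vinson’s 99.5 degrees alarming in the first place.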

Let’s recall that Vinson was asked to report to the CDC twice per day precisely because she had been exposed to two confirmed Ebola-infected people. Surely the CDC representative could have asked her to stay put for another twenty-four hours to see whether her temperature was rising or falling. Yet the focus on her body temperature blinded the agent to the very obvious context that demanded an abundance of caution.

Nina Pham and Amber Vinson both recovered from Ebola. However, because of a mismatch between rules and the situation at hand, they unnecessarily contracted the deadly disease and, just as avoidably, needlessly exposed others to an elevated risk of contracting it. The system was designed to prevent this exact risk, but it failed because judgment and common sense were overrun by adherence to a strict protocol.

Outsourcing Thought and Attention

An explosion of choice in modern society has increased the pressure to focus and filter. Realistically, to make choices, we defer to the opinions of experts or friends, we follow rules of thumb, we use recommendation algorithms—any kind of filter that we think will help us choose well. “Nobody ever got fired for buying IBM,” the corporate IT cliché goes. Even if the IBM system falters, that’s IBM’s fault, not yours; after all, who wouldn’t choose IBM? As Keynes wrote, “Worldly wisdom teaches that it is better for reputation to fail conventionally than to succeed unconventionally.”23

Making conventional choices minimizes regret (and may protect your career), even if a conventional choice isn’t the best option given your needs. But in the process of culling, we often cede power to filters in ways we aren’t mindful of. Our decision frames are set by others, and we forget to keep track of which options and factors we ignore, opening ourselves up to unnecessary risks and missed opportunities. We are ill-equipped to notice when we’re being misdirected by our filters, since they define our very field of perception. The key is to take a step back and ask: What are we losing when we narrow our option set?

We must relearn how to think for ourselves. This does not mean that we cannot rely on others; life today is simply too complex to not do so. But it does mean we should be mindful when doing so. Blind reliance is a recipe for disappointment.
