Introduction

The journey of problem solving

Let’s define a problem as the gap between where you are and where you want to be. A problem isn’t necessarily negative; it can also be an opportunity. At home, you might be deliberating with your spouse about whether to buy a house, where to retire, or which car to buy. In a business setting, you might want to select an enterprise-resource-planning platform, decide whether to acquire a competitor, figure out how to respond to a government’s threat to impose tariffs, or, indeed, respond to a competitor whose latest plan threatens your market share. Problems are everywhere. We, as individuals and managers in organisations, face them constantly.

This book provides you with a structured process that guides you through the steps to solve complex problems. You will learn to frame your problem, explore potential alternatives, and decide which alternative, on balance, is superior. The book relies on what we have learned from teaching this material to hundreds of executives. It provides many hands-on tools, such as case studies, and, because the vast majority of the problems we face require us to meaningfully engage stakeholders, it also offers ‘try this’ exercises that will give you concrete ways to interact with your stakeholders.

In this introductory chapter, you will find out why it’s worthwhile to become a good problem solver, why that is not an easy task, and what you can do to develop your problem-solving skills.

GOOD PROBLEM SOLVERS ARE POPULAR . . . BUT HARD TO FIND

From the World Economic Forum to McKinsey, there is widespread agreement that problem-solving skills are of paramount importance.1 Problem-solving skills often come first on lists of desirable skills, even ahead of other critical ones, such as communication or the ability to deal with ambiguity.

And yet, business education, in particular, does not equip students with good problem-solving skills.2 It is not surprising, therefore, that employers say it is hard to find people with these capabilities.3 In short, become a better problem solver and your popularity will shoot up!

A scarcity versus importance matrix.

SOLVING COMPLEX PROBLEMS IS HARD

If problem-solving skills are in such great demand, why aren’t people developing them more? One key reason is that learning to solve the kinds of problems that executives solve – complex problems4 – is hardly a cakewalk. But before we get into that, let’s take a step back.

With our definition of a problem (a gap between a current and a desired state), we spend most of our waking hours solving problems, from selecting which socks to wear in the morning to ‘betting the farm’ on a new strategy. So, not all problems are alike. This book focuses on a subset, so-called CIDNI (pronounced ‘seed-nee’) problems, which are characterised by three defining features:

  • Complex (C) means that the current and desired states and the obstacles we face along the way are diverse, changing, interdependent, or not transparent.5 What will our profitability be next year? Well, it depends on our revenues on one side and our costs on the other. Closing one of our stores, for instance, would reduce our costs (yay!) but probably also reduce our revenues, so revenues and costs are interdependent.
  • Ill-defined (ID) means that the current state and final state are unclear, and so might be the obstacles along the way.6 The problem may not have any solution at all, and it usually does not have one ‘right’ solution. Ill-defined problems are typically one of a kind, and what constitutes their best solution is at least partly subjective – yes, we may all agree that we should release a product that is high quality and cheap, but we might assign different importance to these two attributes, thereby causing us to disagree on which solution is best.
  • Non-immediate but important (NI) means that we don’t need a solution right away; we have a few days, weeks, or even months to develop one, so we can follow a systematic process to address the problem. In other words, the quality of the solution we ultimately choose is more important than how fast we find it.
A definition versus complexity graph.

Kate considers various job offers

Kate, a business unit manager at a large multinational consumer goods company, is facing a challenging career choice. She has been with the company for 15 years and, although successful, she has grown tired of her job. Over the last few months she has explored new opportunities outside the company, which generated three good, but not perfect, offers. As she considers which to choose, she realises that many factors are important to her: the salary, of course, but also opportunities to progress, the quality of her would-be colleagues, and the potential need to relocate. Finally, she needs to bring her fiancé into the decision, both because she values his input and because having his support will help her in her new position.

Kate’s problem is an example of the CIDNI problems we frequently face at home and at work. Its complexity stems from the interconnectedness of its parts: the job that pays best is outside her preferred region of the world. The challenge is also ill-defined because it’s unclear to Kate exactly how much she values opportunities to progress versus the quality of her colleagues, and it’s also unclear whether her fiancé will have compatible preferences. Finally, although Kate’s challenge is important, she has time to think through it; that is, it is non-immediate.

CIDNI problems7 typically don’t have an obviously superior solution. Instead, they require trading off the pros and cons of alternatives across various criteria that are all important to us. As a result, solving them is a subjective exercise that involves substantial uncertainty and risk.
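To see what such trade-offs can look like in practice, here is a minimal sketch of the weighted-criteria evaluation the book develops in Chapters 5 and 6, applied to a choice like Kate’s. The criteria, weights, and scores below are invented for illustration, not data from the book:

```python
# Minimal sketch: comparing alternatives against weighted criteria.
# Criteria, weights, and scores are illustrative assumptions, not data from the book.

weights = {"salary": 0.35, "progression": 0.25, "colleagues": 0.20, "location": 0.20}

# Each offer is scored from 1 (poor) to 10 (excellent) on each criterion.
offers = {
    "Offer A": {"salary": 9, "progression": 6, "colleagues": 7, "location": 4},
    "Offer B": {"salary": 7, "progression": 8, "colleagues": 6, "location": 8},
    "Offer C": {"salary": 6, "progression": 7, "colleagues": 9, "location": 7},
}

for name, scores in offers.items():
    total = sum(weights[c] * scores[c] for c in weights)  # weighted sum
    print(f"{name}: weighted score = {total:.2f}")
```

Change the weights and the ranking can flip: that subjectivity is precisely what makes CIDNI problems ill-defined.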

Trading off these pros and cons is already challenging when we are solving problems on our own, but we typically need to engage various stakeholders – spouses, children, and parents; colleagues, subordinates, and bosses – who are unlikely to all want the same things. As more people need to be integrated into the solution process, the complexity increases.

One size doesn’t fit all

Because the complexity of the problems we face varies tremendously, we shouldn’t solve them all in the same way.

For many problems, investing in an elaborate solution process makes little sense. One study found that Netflix watchers spend an average of 18 minutes deciding what to watch and that 40% of the people surveyed wanted to watch something different from their significant other.8 (Given these circumstances, 18 minutes actually does not seem so bad!) Having said this, if you conducted such an in-depth analysis every time you selected your socks in the morning, your day would grind to a halt before you could say ‘paralysis by analysis’, all for a small payoff.

Instead, for many problems we’re better off relying on routines, habits and intuition. Psychologists call this System 1 thinking: an automated pathway in our brain that enables us to access lots of data quickly and effortlessly. Using System 1, we make decisions quickly, without our conscious minds influencing them. This capability probably evolved when our ancestors were chased by all sorts of long-toothed animals, when it was a literal life saver. There are only so many times that you can ask yourself whether that noise in the nearby bush is from a rabbit or a lion before you get eliminated from the gene pool. Instead, having an automatic ‘get-the-heck-out-of-here-now!’ function hardcoded in our brain enabled the species to thrive.

For many of our day-to-day decisions, System 1 is the better way to go: It’s what enables us to step on the brakes before we hit that car in front of us. We don’t need to make a conscious decision, because our intuitive answer usually does a good job, and it does so incredibly fast. It is because we can rely on System 1 thinking for many small decisions that we can get through our days with reasonable efficiency.

However, the speed and relative ease of System 1 thinking come at a cost: It doesn’t care about the quality of the evidence it uses to make the decision. This is particularly relevant because many of today’s challenges are probably a lot more complex than those we faced when our neural thinking and decision-making mechanisms came to be.

Most notably, by offering a forceful response to what Nobel laureate psychologist Daniel Kahneman calls WYSIATI (for ‘what you see is all there is’), System 1 leaves us vulnerable to various cognitive biases, which pop up time and again as we solve complex problems. Frequent biases include:

  • Confirmation bias, where we search for and interpret data in a way that confirms our existing beliefs.9 For instance, when we read the five-star reviews of a restaurant that we are considering but ignore the one-star reviews.
  • Status-quo bias, where we prefer not to change anything because we perceive any departure from the current state as a loss.10
  • Bias-blind spot, where we perceive ourselves to be less biased than others.11
  • Anchoring, where we rely too heavily on the first piece of information considered when making a judgment, even if it’s entirely unrelated.12

The list goes on. Taxonomies routinely include over 150 biases and even after removing those that significantly overlap, a good 100 remain.13

In short, when dealing with complex problems, we can’t blindly trust our intuition because it will cause us to fall into many traps. System 1 seems appropriate only in those settings where you’re likely to choose a good answer, mistakes have a low cost, and a fast answer is valuable.14 If these conditions aren’t met, however, we’d better consciously initiate a more deliberative, slow, and effort-intensive approach; we need to engage our System 2 thinking.

A diagram shows the system between where I am and where I want to be.

To be clear, System 2 doesn’t guarantee that we will be unbiased. Completely getting rid of biases is extremely hard, and many believe it to be impossible. But engaging System 2 should help us be less biased.15

To sum up, when facing a complex problem, we must pay attention, which implies that we must slow down. Pretty simple, right? Well, almost. One big difficulty is that System 1 thinking is always running in the background. It is our default approach to dealing with life, so, before we even realise it, we’ve ‘solved’ the problem by answering with an expletive that gets us thinking regretfully, ‘Did I really just say that?’

Engaging System 2 takes conscious effort. It requires us to Stop. Think. And only then Act. Which is harder than it sounds when we operate in the heat of the moment. Yet, if we don’t do it, we let our biases take over with all the negative consequences that doing so might entail.16

We fail to solve complex problems in many ways

There are many ways to fall short during the solution process. While helping executives, we see the six listed below particularly frequently.

  • We frame the problem poorly: Framing the problem means defining what it is and what it is not. Do you really want to increase revenues, or do you want to increase profitability . . . or your return on investment? Although all three frames address the same theme, they don’t have the same scope. Framing our effort around increasing revenues, for instance, would leave out reducing costs, whereas framing it around increasing profitability would consider both. For complex problems, framing effectively is harder than it appears, because System 1 tells us that we already know what the problem is. ‘Stop wasting time reviewing useless information,’ goes the autopilot in our head, ‘we know what we want, let’s get on with it.’ However, there’s usually value in looking beyond the surface features of a complex problem to validate that what we see is the disease and not just one of its symptoms. That’s why Chapters 1, 2, and 3 describe the science and art of framing, enabling you to gain an in-depth understanding of your problem that you can summarise in one overarching key question, your quest.
  • We make bad decisions . . . : In a McKinsey survey, 72% of senior-executive respondents felt that bad strategic decisions in their organisation were at least as frequent as good ones.17 At some point during the solution process, we need to decide which alternative to pursue, which also means that we need to decide which alternatives not to pursue. Yet, making that selection is not trivial. We often see executives entering this stage with preconceptions of what the path forward should be. They spend much of their energy pushing their agenda without seriously exploring alternatives; they ask questions to advocate rather than enquire. As a result, they miss out on solutions that, unbeknownst to them, they might have preferred! Although forcefully ploughing ahead gives us the impression that we are getting things done, it might also result in selecting an alternative that we will end up regretting. ‘Be careful what you wish for’ applies to all of us. Chapter 4 explores how to create better alternatives.
  • . . . or we fail to decide altogether: If choosing a poor alternative can be disastrous, failing to decide can also be harmful. During the four years before Airbus’ announcement of the A320neo, Boeing had been debating whether to update the 737 or start afresh.18 Not deciding results in perpetuating the status quo, which might lead us to go from a suboptimal position to a downright unacceptable one. Facing uncertainty, we are often tempted to resort to a ‘wait and see’ attitude. Of course, we don’t rationalise it that way; instead we run more analyses, gather more data, engage further with stakeholders, build consensus, . . . and we sometimes miss our window of execution, as Boeing’s development of the 737 Max illustrates. Failing to decide is making an (implicit) decision. You’ll get practical tools to avoid this trap throughout the book and in particular in Chapters 7 and 9.
  • We don’t engage key stakeholders: Even a solution that seems superior may fail if key stakeholders don’t support it. This is especially true when the problem is controversial, with different stakeholders favouring different alternatives. Skilfully engaging these stakeholders makes them co-creators of the end product, thereby increasing the likelihood that they will support the solution.19 In addition, engaging helps us overcome our own blind spots and biases, as people with different viewpoints, backgrounds, skillsets and agendas are likely to contribute valuable ideas that we might not have.20 Furthermore, statistically speaking, pooling information from across independent people reduces noise, which results in more reliable information.21 And being exposed to team decision-making appears to help people make better individual decisions subsequently.22 Beware, though, as you can have too much of a good thing: under time pressure, for instance, engaging can be counterproductive. Similarly, broad engagement on trivial issues can be perceived as wasteful,23 and when members of a group already hold an opinion, group discussion can amplify that initial preference.24 In short, there are times when you need to be consultative and others that call for resolute leadership. The point is that you shouldn’t consult everyone on every decision but engage judiciously. You will see different ways to engage stakeholders effectively throughout the book.
  • We fail to update our thinking: Complex problem solving occurs under uncertainty in changing environments, which conflicts with the human need for certainty. As a result, we often form opinions early in the problem-solving process, seek evidence selectively, and stick to our opinions even when new evidence should lead us to revise our conclusions.25 One key tenet of scientific reasoning is to treat ideas as hypotheses that we test, updating our thinking as we learn.26 Doing so, we adopt a probabilistic mindset, where we don’t see things as right or wrong but evaluate their probability, which we update when new evidence surfaces (see the short sketch after this list). As economist John Maynard Keynes reportedly quipped, ‘When the facts change, I change my mind. What do you do, sir?’27 There is now empirical evidence that adopting such a scientific approach helps in entrepreneurial settings. This book shows you how to develop this mindset, particularly in Chapter 9.
  • We don’t execute: Identifying a good solution is required to solve our problem, but it’s not enough. Ultimately, we also need to execute it successfully. Although execution is beyond the scope of this book, we will address this all-important theme throughout.
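To make the probabilistic mindset concrete, here is a minimal sketch of a Bayesian update, the kind of revision Chapter 9 develops. All the numbers are invented for illustration:

```python
# Minimal sketch of Bayesian updating: revising our belief in a hypothesis
# in light of new evidence. All numbers are illustrative assumptions.

prior = 0.30            # P(H): initial belief that, say, a strategy will work
p_e_given_h = 0.80      # P(E|H): chance of seeing this pilot result if it works
p_e_given_not_h = 0.20  # P(E|not H): chance of the same result if it doesn't

# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
posterior = p_e_given_h * prior / p_e

print(f"Belief before the evidence: {prior:.0%}")     # 30%
print(f"Belief after the evidence: {posterior:.0%}")  # about 63%
```

The point is not the arithmetic but the habit: a belief held at 30% before the evidence is neither right nor wrong; it is a probability that the new evidence moves to about 63%.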

THIS BOOK OFFERS A SOLUTION

So, we all face problems of varying complexity that call for different solution processes. The simplest problems should be solved with System 1,28 whereas complex problems call for a more deliberate approach. What does ‘more deliberate’ mean, then? Recent decades have seen the development of a myriad of methods to help executives make better decisions. And yet, these are rarely used in the field, arguably because they are often too far removed from practical requirements and too demanding to use.29

Solve your problem using FrED

What is needed is an approach that helps safeguard us against our fallible instincts, has the versatility to apply to problems of different degrees of complexity, and remains easy enough to use. We’ve developed such an approach; we call it FrED:30

  • Frame answers the question: What is my problem? Framing consists of defining the problem and synthesising it into a single overarching key question, the quest.
  • Explore answers the question: How may I solve my problem? Exploring consists of identifying potential answers to the quest – your alternatives – and the criteria that will help you decide how much you like each.
  • Decide answers the question: How should I solve my problem? Deciding consists of identifying which alternative, on balance, you prefer.
A loop diagram depicts FrED. The three steps are: frame, what's my problem; explore, how may I solve my problem; decide, how should I solve my problem.

Think of FrED’s three steps as the three legs of a stool. You need all three, as there’s only so much that two can do to compensate for a weak third. In other words, to arrive at a good outcome, you can’t do a poor job in any of the three steps.

Note as well that framing is problem centric, whereas exploring and deciding are solution centric. We all have a tendency to jump quickly to being solution centric (that’s our System 1 in action). Yet it pays off to learn to be comfortable being uncomfortable – that is, to spend time better understanding the problem; it’s okay if we do not have answers right away.

Although FrED’s three steps appear linear, in practice you will iterate as new evidence leads you to update earlier conclusions. In fact, rather than starting with framing, problem solving often starts with an idea popping into your head as you take a walk or talk with someone. Nothing wrong with that; with FrED, you can start anywhere and go in any direction!

Dragon Master™, your companion app for solving complex problems

We’ve developed an app that can help you solve your complex problems. Called Dragon Master™ (for reasons that will become obvious in just a few pages), it enables you to go through the FrED process in one go.

Dragon Master™ is free and accessible at dragonmaster.imd.org

Don’t let FrED’s simplicity deceive you. It is immensely versatile, applying to whatever challenge you face, no matter its nature. We’ve used it to help people solve problems in disciplines as varied as business strategy, particle physics, medicine, architecture, and philosophy. You can also adapt its granularity to match your circumstances, from simply using it as a mental roadmap to organise your thinking on the fly, all the way to structuring a multi-year project.

Tested and refined over hundreds of projects, FrED promotes problem solving in two major ways. First, it helps you provide clear direction to yourself, your team and your organisation about what you’ll do. Second, it helps you engage your key stakeholders better. As we’ll see, it is critical to do both: provide direction and engage stakeholders. If you only provide direction (adopting ‘my way or the highway’ as a motto), you are likely to lose your stakeholders along the way. Similarly, a pure focus on engagement (thinking ‘let a thousand flowers bloom’) puts you at risk of investing your limited resources ineffectively. That’s why the book continuously moves back and forth between these two poles.

An engagement versus direction graph depicts FrED-powered problem solving.

By providing an overarching structure to organise your problem-solving effort, FrED is like an operating system. Much like Windows or MacOS, it provides a stable underlying platform on which you can run specialised analyses – financial, marketing, or supply chain analyses – that are required to solve your problem. More specifically, FrED allows you to gain systematic insights into your problem that help you go beyond your intuition while remaining simple to apply.

A graph shows FrED’s sweet spot.

One final point of introduction about FrED. It is common in managerial settings to follow the recommendations of ‘gurus’ who make arguments that sound reasonable but have little empirical support.31

As strong proponents of evidence-based management, we have strived to include in this book ideas that have solid empirical support. Some of this support comes from our own experience coaching hundreds of executives. When that’s the case, we typically present the source as ‘in our experience’. Most of the ideas, however, come from a body of literature in the social sciences, engineering, design, medicine, and other disciplines, where they have been tested under better-controlled conditions. We have strived to indicate these sources so that you can place more confidence in the findings with the strongest empirical support.32

One of these sources of empirical findings is the Crew Resource Management (CRM) literature. Over the past few decades, the aviation industry has taken unprecedented steps to improve how airline crews make decisions, steps that have coincided with a drastic drop in fatalities.33 The CRM literature centralises these learnings, providing a strong body of empirically derived knowledge that is being adopted in other areas, including the maritime and healthcare industries.34

Because CRM practices build on a rigorous set of characteristics – centralised systems for reporting accidents, systematic investigation of accidents, development and testing of prescriptive rules, and so on – we believe they provide high-quality evidence that can help decision making in other settings, including managerial ones. As a result, we call on this body of knowledge extensively throughout the book.

In the end, however, the applicability of any finding to another setting depends on the validity of the original finding and on its capacity to be validly transposed to the new setting. Although we believe that the ideas we present score well on both counts, we strongly encourage you to engage your critical thinking and test each claim for yourself.

How to use this book

Mimicking FrED’s structure, the book has three parts: frame, explore, decide.

Part I, Frame, describes the science and art of framing, enabling you to gain an in-depth understanding of your problem that you summarise in one overarching key question, your quest. More specifically, Chapter 1 lays out how to define your quest and contextualise it by introducing a hero, a treasure and a dragon. Chapter 2 shows how to fine-tune your hero-treasure-dragon-quest sequence by using four rules. Then, Chapter 3 helps you sharpen your understanding even further by exploring the problem’s underlying root causes.

Part II, Explore, sets the stage for exploring potential alternatives and criteria to solve your problem. Chapter 4 shows how to explore the solution space using how maps, which results in concrete alternatives. Chapter 5 helps you explore, articulate and weigh the relevant decision criteria that will help you identify the most promising alternatives.

Part III, Decide, builds on the work you have done in the first two parts to help you make well-thought-out decisions. Chapter 6 lays out how to evaluate and compare alternatives using a weighted set of criteria. Chapter 7 shows how to make interdependent decisions across multiple domains of choice. Chapter 8 shows how to summarise your conclusions in a compelling message to win the support of your key stakeholders. Finally, Chapter 9 helps you move forward under uncertainty, showing you how to adopt a probabilistic mindset suited to complex situations and linking the strategising that you have done throughout the journey with the execution still ahead.

As you read through the chapters, relating the ideas and tools to your challenge, don’t hesitate to skip back and forth between chapters. FrED is best used as an iterative process in which insights from one step help you modify the conclusions you reached in previous steps.35 Don’t look at such iteration as a failure but as welcome progress towards being less wrong, towards reaching better conclusions.

In practice, we have found it useful to go through the whole book once to understand the bigger picture before deep diving into the chapters that best address the most important issues you face. In parallel, the dedicated Dragon Master™ app will help you capture your thinking at all three stages of the process.

CHAPTER TAKEAWAYS

Problem solving is bridging the gap between where you are and where you want to be. As such, we constantly solve all sorts of problems.

CIDNI problems are particularly relevant. ‘CIDNI’ stands for complex, ill-defined, non-immediate but important. In what follows, we refer to CIDNI problems simply as complex problems.

For simple problems, following our intuition (System 1 thinking) is fine. For complex problems, however, intuition is dangerous because it makes us particularly vulnerable to various biases. Instead, we should be more deliberative, engaging System 2 thinking.

One way to engage System 2 thinking is to use a three-step approach to solve complex problems: FrED.

  • Frame answers the question: What is my problem? Framing consists of defining the problem and synthesising it into a single overarching key question, the quest.
  • Explore answers the question: How may I solve my problem? Exploring consists of identifying potential answers to the quest – your alternatives – and the criteria that will help you decide how much you like each.
  • Decide answers the question: How should I solve my problem? Deciding consists of identifying which alternative, on balance, you prefer.

INTRODUCTION NOTES

  1. For the WEF reference, see page 22 of World Economic Forum (2016). The future of jobs: Employment, skills and workforce strategy for the fourth industrial revolution. Global Challenge Insight Report, World Economic Forum, Geneva. For the McKinsey reference, see McKinsey Quarterly (2020). Five fifty: Soft skills for a hard world. See also National Research Council (2011). Assessing 21st century skills: Summary of a workshop.
  2. Bunch, K. J. (2020). ‘State of undergraduate business education: A perfect storm or climate change?’ Academy of Management Learning & Education 19(1): 81–98.
  3. See page 44 of PwC (2017). The talent challenge: Harnessing the power of human skills in the machine age.
  4. There’s no unified definition of what a complex problem is. For discussions, see Dörner, D. and J. Funke (2017). ‘Complex problem solving: What it is and what it is not.’ Frontiers in Psychology 8: 1153.
  5. See, for instance, p. 5 of Mason, R. O. and I. I. Mitroff (1981). Challenging strategic planning assumptions: Theory, cases, and techniques. New York, Wiley. See pp. 87–90 of Mason, R. O. (1969). ‘A dialectical approach to strategic planning.’ Management Science 15(8): B-403–B-414; Wenke, D. and P. A. Frensch (2003). ‘Is success or failure at solving complex problems related to intellectual ability?’ The psychology of problem solving. J. E. Davidson and R. J. Sternberg. New York, Cambridge University Press: 87–126.
  6. See p. 4 of Pretz, J. E., A. J. Naples and R. J. Sternberg, ibid. Recognizing, defining, and representing problems: 3–30; see p. 462 of Smith, S. M. and T. B. Ward (2012). Cognition and the creation of ideas. Oxford handbook of thinking and reasoning, Oxford: 456–474.
  7. For brevity, we’ll refer to CIDNI problems from now on as complex problems.
  8. Goldman, R. and C. Gilmor (2016). New study reveals we spend 18 minutes every day deciding what to stream on Netflix. IndieWire.
  9. See, for instance, Nickerson, R. S. (1998). ‘Confirmation bias: A ubiquitous phenomenon in many guises.’ Review of General Psychology 2(2): 175–220.
  10. See, for instance, Kahneman, D., J. L. Knetsch and R. H. Thaler (1991). ‘The endowment effect, loss aversion, and status quo bias.’ Journal of Economic Perspectives 5(1): 193–206.
  11. Pronin, E., D. Y. Lin and L. Ross (2002). ‘The bias blind spot: Perceptions of bias in self versus others.’ Personality and Social Psychology Bulletin 28(3): 369–381.
  12. Anchoring is tricky. Tversky and Kahneman’s 1974 experiment is a classic example of anchoring. They generated a random number between 0 and 100 before asking participants to decide whether the percentage of African countries in the UN was above or below that number. Anchoring was exposed when the participants’ median estimate was larger when the random number was high than when it was low. See Tversky, A. and D. Kahneman (1974). ‘Judgment under uncertainty: Heuristics and biases.’ Science 185(4157): 1124–1131.
  13. For a taxonomy, see Dimara, E., S. Franconeri, C. Plaisant, A. Bezerianos and P. Dragicevic (2018). ‘A task-based taxonomy of cognitive biases for information visualization.’ IEEE Transactions on Visualization and Computer Graphics 26(2): 1413–1432. See also Yagoda, B. (2018). ‘The cognitive biases tricking your brain.’ The Atlantic (September).
  14. See p. 79 of Kahneman, D. (2011). Thinking, fast and slow. New York, Farrar, Straus and Giroux. See also Milkman, K. L., D. Chugh and M. H. Bazerman (2009). ‘How can decision making be improved?’ Perspectives on Psychological Science 4(4): 379–383. For biases more specific to decision and risk analysis, see Montibeller, G. and D. von Winterfeldt (2015). ‘Cognitive and motivational biases in decision and risk analysis.’ Risk Analysis 35(7): 1230–1251.
  15. See, for instance, Lawson, M. A., R. P. Larrick and J. B. Soll (2020). ‘Comparing fast thinking and slow thinking: The relative benefits of interventions, individual differences, and inferential rules.’ Judgment & Decision Making 15(5).
  16. Debiasing is hard. Kahneman, for one, is dubious that it can be done, and research has shown that merely knowing about a bias doesn’t prevent us from succumbing to it or make us aware that we have it (Pronin, E., D. Y. Lin and L. Ross (2002). ‘The bias blind spot: Perceptions of bias in self versus others.’ Personality and Social Psychology Bulletin 28(3): 369–381). However, some approaches have been proposed (Soll, J. B., K. L. Milkman and J. W. Payne (2015). A user’s guide to debiasing. The Wiley Blackwell handbook of judgment and decision making. G. Keren and G. Wu), and recent research provides evidence that some training might help. See Morewedge, C. K., H. Yoon, I. Scopelliti, C. W. Symborski, J. H. Korris and K. S. Kassam (2015). ‘Debiasing decisions: Improved decision making with a single training intervention.’ Policy Insights from the Behavioral and Brain Sciences 2(1): 129–140; Sellier, A.-L., I. Scopelliti and C. K. Morewedge (2019). ‘Debiasing training improves decision making in the field.’ Psychological Science 30(9): 1371–1379.
  17. We make bad decisions. Adopting a process perspective, a bad decision is the result of an unsystematic decision-making process or, to be more precise, a decision process that bypasses one or more of FrED’s steps. See Enders, A., A. König and J.-L. Barsoux (2016). ‘Stop jumping to solutions!’ MIT Sloan Management Review 57(4): 63; De Smet, A., G. Lackey and L. M. Weiss (2017). ‘Untangling your organization’s decision making.’ McKinsey Quarterly: 69–80; De Smet, A., G. Jost and L. Weiss (2019). ‘Three keys to faster, better decisions.’ The McKinsey Quarterly. Management scholar Denise Rousseau identifies six organisational biases that lead to poor decisions, which overlap significantly with our own observations (slightly revised): solving the wrong problem, ignoring politics, considering just one alternative, focusing on a single criterion, letting narrow interests dominate, and over-relying on easily available evidence. Rousseau, D. M. (2018). ‘Making evidence-based organizational decisions in an uncertain world.’ Organizational Dynamics.
  18. Campbell, D. (2019). ‘Redline: The many human errors that brought down the Boeing 737 Max.’ The Verge 9. See also Clark, N. and J. Mouawad (2010). Airbus to update A320 with new engines and wings. The New York Times; Polek, G. (2011). Boeing takes minimalist approach to 737 Max. Aviation International News; Peterson, K. and T. Hepher (2011). Race is on for sales of Boeing’s MAX vs Airbus neo. Reuters; Hemmerdinger, J. (2021). ‘How and why Boeing re-engined the 737 to create the Max.’ FlightGlobal; The Guardian (2020). Boeing 737 Max readies for takeoff after EU signals safety approval is imminent; Gelles, D., N. Kitroeff, J. Nicas and R. R. Ruiz (2019). Boeing was ‘go, go, go’ to beat Airbus with the 737 Max; Herkert, J., J. Borenstein and K. Miller (2020). ‘The Boeing 737 MAX: Lessons for engineering ethics.’ Science and Engineering Ethics 26(6): 2957–2974; Smith, A., J. Maia, L. Dantas, O. Aguoru, M. Khan and A. Chevallier (2021). Tale spin: Piloting a course through crises at Boeing. IMD Case 7-2279.
  19. This tendency of people to value things that they contributed to creating is sometimes called the IKEA effect. See Norton, M. I., D. Mochon and D. Ariely (2012). ‘The IKEA effect: When labor leads to love.’ Journal of Consumer Psychology 22(3): 453–460.
  20. See, for instance, Pronin, E., T. Gilovich and L. Ross (2004). ‘Objectivity in the eye of the beholder: Divergent perceptions of bias in self versus others.’ Psychological Review 111(3): 781; Pronin, E., J. Berger and S. Molouki (2007). ‘Alone in a crowd of sheep: Asymmetric perceptions of conformity and their roots in an introspection illusion.’ Journal of Personality and Social Psychology 92(4): 585.
  21. Ariely, D., W. Tung Au, R. H. Bender, D. V. Budescu, C. B. Dietz, H. Gu, T. S. Wallsten and G. Zauberman (2000). ‘The effects of averaging subjective probability estimates between and within judges.’ Journal of Experimental Psychology: Applied 6(2): 130; Johnson, T. R., D. V. Budescu and T. S. Wallsten (2001). ‘Averaging probability judgments: Monte Carlo analyses of asymptotic diagnostic value.’ Journal of Behavioral Decision Making 14(2): 123–140.
  22. Maciejovsky, B., M. Sutter, D. V. Budescu and P. Bernau (2013). ‘Teams make you smarter: How exposure to teams improves individual decisions in probability and reasoning tasks.’ Management Science 59(6): 1255–1270.
  23. De Smet, A., G. Jost and L. Weiss (2019). ‘Three keys to faster, better decisions.’ The McKinsey Quarterly.
  24. This is called group polarisation; see, for instance, Sunstein, C. R. (1999). ‘The law of group polarization.’ University of Chicago Law School, John M. Olin Law & Economics Working Paper (91).
  25. See Samuelson, W. and R. Zeckhauser (1988). ‘Status quo bias in decision making.’ Journal of Risk and Uncertainty 1(1): 7–59.
  26. We fail to update our thinking. Bayesian updating, a.k.a. updating one’s thinking in light of new evidence, is central to FrED, so we’ll talk more about it. For the benefit of scientific thinking in entrepreneurial settings, a recent study found that entrepreneurs who had been trained in defining clear hypotheses, rigorously testing them, and deciding based on the results of the tests achieved significantly better results than others who didn’t get the same training; see Camuffo, A., A. Cordova, A. Gambardella and C. Spina (2020). ‘A scientific approach to entrepreneurial decision making: Evidence from a randomized control trial.’ Management Science 66(2): 564–586.
  27. This quote is attributed to John Maynard Keynes with some controversy, as some sources credit economist Paul Samuelson with coining a similar phrase. See Kay, J. (2015). ‘Keynes was half right about the facts.’ Financial Times 4.
  28. System 1 and System 2 thinking. The terms were coined by psychologists Keith Stanovich and Richard West (Stanovich, K. E. and R. F. West (2000). ‘Individual differences in reasoning: Implications for the rationality debate?’ Behavioral and Brain Sciences 23(5): 645–665) and adopted by Kahneman (see pp. 20–28 of Kahneman, D. (2011). Thinking, fast and slow. New York, Farrar, Straus and Giroux). To learn more, Barbara Spellman’s introduction is a great primer (Spellman, B. A. (2011). Individual reasoning. Intelligence analysis: Behavioral and social scientific foundations. C. Chauvin and B. Fischhoff, National Academies Press). Kahneman’s Nobel lecture (Kahneman, D. (2002). ‘Maps of bounded rationality: A perspective on intuitive judgment and choice.’ Nobel Prize Lecture 8: 351–401) offers a more detailed summary. For yet more, see Kahneman (2011).
  29. Riabacke, M., M. Danielson and L. Ekenberg (2012). ‘State-of-the-art prescriptive criteria weight elicitation.’ Advances in Decision Sciences 2012.
  30. FrED comes from everywhere. We developed FrED based on our experience working with hundreds of executives and by integrating the problem-solving approaches of multiple academic disciplines, including the scientific hypothesis-driven approach (Gauch, H. G. (2003). Scientific method in practice, Cambridge University Press), the TRIZ methodology in engineering (Ilevbare, I. M., D. Probert and R. Phaal (2013). ‘A review of TRIZ, and its benefits and challenges in practice.’ Technovation 33(2–3): 30–37), the design thinking way of, well, designers, and the methods used by top strategy consultancies (Davis, I., D. Keeling, P. Schreier and A. Williams (2007). ‘The McKinsey approach to problem solving.’ McKinsey Staff Paper 66).
  31. Don’t trust the gurus! In medical settings, the opinion of experts is the lowest-grade evidence. For more, see, for instance, Galluccio, M. (2021). Evidence-informed policymaking. Science and diplomacy, Springer: 65–74; Ruggeri, K., S. van der Linden, C. Wang, F. Papa, J. Riesch and J. Green (2020). ‘Standards for evidence in policy decision-making.’
  32. For more on this practice see, for instance, p. xxii of Barends, E., D. M. Rousseau and R. B. Briner (2014). Evidence-based management: The basic principles, Amsterdam.
  33. Pasztor, A. (2021). The airline safety revolution: The airline industry’s long path to safer skies. The Wall Street Journal.
  34. See, for instance, Helmreich, R. L. (2000). ‘On error management: Lessons from aviation.’ British Medical Journal 320(7237): 781–785; Haerkens, M., M. Kox, J. Lemson, S. Houterman, J. van der Hoeven and P. Pickkers (2015). ‘Crew resource management in the intensive care unit: A prospective 3-year cohort study.’ Acta Anaesthesiologica Scandinavica 59(10): 1319–1329; Wahl, A. M. and T. Kongsvik (2018). ‘Crew resource management training in the maritime industry: A literature review.’ WMU Journal of Maritime Affairs 17(3): 377–396; Helmreich, R. L., J. A. Wilhelm, J. R. Klinect and A. C. Merritt (2001). ‘Culture, error, and crew resource management.’ Haerkens, M. H., D. H. Jenkins and J. G. van der Hoeven (2012). ‘Crew resource management in the ICU: The need for culture change.’ Annals of Intensive Care 2(1): 1–5.
  35. FrED is iterative. See Rittel’s wicked problems (Rittel, H. W. (1972). ‘On the planning crisis: Systems analysis of the “first and second generations”.’ Bedriftsokonomen 8: 390–396). FrED has friends. Our process is just one of many to help decision making and problem solving. Others include DODAR (Diagnosis, Options, Decide, Assign Tasks, Review) and FOR-DEC (Facts, Options, Risks and Benefits – Decide, Execute, Check) from the aviation industry (see p. 167 of Orasanu-Engel, J. and K. L. Mosier (2019). Flight crew decision-making. Crew resource management. B. G. Kanki, J. Anca and T. R. Chidester. London, Academic Press: 139–183); OODA (Observe, Orient, Decide, Act); and many others. Woods identified 150 strategies used in many disciplines (Woods, D. R. (2000). ‘An evidence based strategy for problem solving.’ Journal of Engineering Education 89(4): 443–459).