Chapter Nine

Go!

So, having framed, explored, and decided, we have now identified our strategy. Or have we? Hard to tell, under uncertainty. Chapter 9 shows how you can better manage uncertainty by adopting a probabilistic mindset. In addition, it shows how to further improve your problem solving by honing your skills and focusing on the process rather than the outcome. The chapter concludes by showing how FrED can help you make big decisions in just a few minutes.

Let’s first look at managing uncertainty. Despite our best efforts, it is likely that our conclusions still partly rely on assumptions. Not only that, but circumstances might also have changed during the problem-solving process; after all, it’s not because we have finished our analysis that the environment has stopped evolving or that new evidence has stopped appearing.

A loop diagram depicts FrED.

So, how should we deal with changing circumstances? Continuing with a strategy when we should change course is called ‘plan continuation bias’ or PCE (for ‘plan continuation error’); it’s a prevalent bias at the origin of the Torrey Canyon oil spill and of a plethora of less visible mishaps.

The Torrey Canyon supertanker fails to update its course

In the early morning of Saturday, 18 March 1967, the first officer of the supertanker Torrey Canyon corrected the ship’s course after realising that it wasn’t where it was supposed to be. The 300-metre ship was carrying 120,000 tonnes of crude oil north past the Scilly Isles, just off the coast of South West England.

When the sleep-deprived Captain Rugiati woke, he countermanded the first officer’s change.1 He was under a tight deadline to reach his destination at high tide, and the two-hour detour that the first officer had initiated might mean waiting five days for a new window. So Captain Rugiati stuck to his original plan, going through the Scilly Isles instead of around them. The decision had a monumental environmental impact: the ship ran aground, spilling its cargo of crude over a 300-km stretch of British and French coasts in what was, at the time, the world’s worst oil spill.

Aside from time pressure, the Torrey Canyon also suffered from other issues, including a less than ideal navigation system and poor equipment design. Yet, a chief cause of the accident appeared to be Captain Rugiati’s slowness to adjust in light of new evidence. He stuck to his original plan for far too long, and when he finally updated it, it was too late to avert the accident.2

Overconfidence is a primary cause of PCEs. For instance, a study of accidents involving French military aircraft found that, in a majority of cases (54%), the crew persevered not because they failed to process information but because they placed too much trust in their ability to manage risk. What’s more, we don’t get out of these situations on our own: 80% of the recoveries from perseveration involved outside intervention.3

The aviation community treats errors as symptoms of deeper issues,4 and it takes systematic, evidence-based steps to reduce them, so we would do well to learn from it. For instance, one such deliberate practice is for pilots to set personal weather minimums below which they will cancel an ongoing landing procedure.5 What would be an equivalent mitigation approach for strategic decision making in organisations? Well, adopting a two-pillar approach might help:

  • Predetermine a rule to continue with the current strategy: For instance, you might ask yourself, what would it take to change my mind (that, say, our chosen strategy remains the best course of action)? What early indicators of success (or lack thereof) would we expect to see within a few days, weeks or months of rolling out our strategy? Be especially clear on the data that would lead you to decrease your confidence in your conclusions or even change your mind as to how to move forward. This is the data that we don’t want to see but that we need to see. (A minimal sketch of what such a predetermined rule might look like follows this list.)
  • When conditions change, stick to that predetermined rule: Although it might sound trivial that we ought to adhere to the rule that we developed for ourselves ahead of stressful situations, empirical evidence shows that in the heat of the moment we tend to throw caution to the wind. Some analyses of pilots, for instance, showed that nearly everyone (96.4% of test subjects) violated the rules that they had set for themselves ahead of the flight, choosing instead to adopt a more dangerous course of action.6 To help you be one of the remaining 3.6%, remember that you can enlist the help of a commitment device, such as a Ulysses’ contract (see Chapter 1).
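
Purely to make these two pillars concrete, here is a minimal sketch in Python of a predetermined rule encoded as explicit tripwires. It is not taken from the book’s examples: the indicator names, thresholds and observed values are invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Tripwire:
    indicator: str                    # early indicator we agreed to watch
    threshold: float                  # level below which we agreed to reconsider
    observed: Optional[float] = None  # filled in once real data comes in

# Agreed *before* rollout (hypothetical indicators and figures)
tripwires: List[Tripwire] = [
    Tripwire("weekly_signups", threshold=200),
    Tripwire("pilot_retention_rate", threshold=0.60),
]

def must_reconsider(rules: List[Tripwire]) -> bool:
    """Apply the predetermined rule mechanically: any breached tripwire
    triggers a deliberate review of the strategy."""
    return any(r.observed is not None and r.observed < r.threshold for r in rules)

# A few weeks into the rollout, record what actually happened
tripwires[0].observed = 140
tripwires[1].observed = 0.72

if must_reconsider(tripwires):
    print("Predetermined rule triggered: revisit the strategy rather than persevere.")
```

The point of writing the rule down in this form is that, when conditions change, there is nothing left to renegotiate in the heat of the moment: the rule either fires or it doesn’t.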

 ADOPT A PROBABILISTIC MINDSET

Le doute n’est pas une condition agréable, mais la certitude est absurde.7

– Voltaire (Letter to Frederick II of Prussia, 6 April 1767)

Having gone through a FrED cycle, evaluate your confidence that the strategy you selected is what you ought to do, from 0 (not at all confident) to 100 (absolutely sure).

Quantify your confidence (or lack thereof)

In philosopher and mathematician Bertrand Russell’s words, ‘Everything is vague to a degree you do not realise till you have tried to make it precise’. That observation is particularly salient for the evaluation of probability. In fact, people interpret likelihood labels (e.g. very likely, unlikely, roughly even odds) in different ways, sometimes vastly different ways, so you will be well served to evaluate your confidence numerically (from 0 to 100).8

Almost no chance        1–5%
Very unlikely           5–20%
Unlikely                20–45%
Roughly even chance     45–55%
Likely                  55–80%
Very likely             80–95%
Almost certain(ly)      95–99%

Adapted from the Office of the Director of National Intelligence
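
If it helps to make the mapping explicit, here is a minimal sketch that treats the table above as a lookup and starts from the midpoint of the relevant range, in the spirit of the midpoint logic described below. The labels and percentages come from the table; the function itself is only illustrative.

```python
# Verbal likelihood labels mapped to the percentage ranges in the table above.
LIKELIHOOD_RANGES = {
    "almost no chance":    (1, 5),
    "very unlikely":       (5, 20),
    "unlikely":            (20, 45),
    "roughly even chance": (45, 55),
    "likely":              (55, 80),
    "very likely":         (80, 95),
    "almost certain":      (95, 99),
}

def starting_estimate(label: str) -> float:
    """Take the midpoint of the range as a first numerical estimate,
    then adjust it up or down as the evidence warrants."""
    low, high = LIKELIHOOD_RANGES[label]
    return (low + high) / 2

print(starting_estimate("likely"))  # 67.5
```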

Detractors of this approach often point out that numerical estimates aren’t particularly intuitive to most people. However, Dartmouth professor of government Jeff Friedman offers a different take: ‘Most people are between the ages of 0 and 100. When you see a stranger, you can usually narrow down a plausible range for that person’s age. Maybe, all else being equal, you think that person is between 30 and 50 years old. If you have no other information, you take the midpoint, 40. Maybe that seems a bit low, so you increase your estimate to 42. I believe that few people would find that logic to be complex or unusual.

All probabilities are between 0 and 100%. When you’re asked to estimate the chances that a statement is true, you can usually narrow down a plausible range for what that probability should entail. Maybe, all else being equal, you think that probability is between 30 and 50%. If you have no other information, you take the midpoint, 40%. Maybe that seems a bit low, so you increase your estimate to 42%. From a logical standpoint, it’s really no different than estimating a stranger’s age.

So why do we intuitively think that estimating probabilities is odd? I think the answer is that we don’t have much opportunity to calibrate our judgments about uncertainty.

Over the course of your life, you get lots of feedback about what 42-year-old people look like. If I ask you to think about a 42-year-old person, you can probably conjure up a concrete image of what that entails. By contrast, uncertainty is abstract and very few people spend the time and effort to calibrate their assessments of uncertainty. Thus, it’s no surprise that people find it hard to consider what a 42% chance ‘looks like’. But that isn’t grounds for thinking that probabilistic reasoning is invalid or inappropriate: it’s just a reason to spend more time and effort figuring out how to reason in rigorous ways.’9

Watch out for overconfidence

When we ask executives to rate their confidence in the conclusions of an analysis they just conducted, many report being highly confident (often at 80% or higher).

But note that your confidence in your conclusions depends on your confidence in your quest, alternatives, criteria, and evaluations – and those are multiplicative, so any weakness in your analysis transfers to your conclusions!

A diagram shows a worked decision matrix: confidence in the quest (‘How should we ...?’) is 87%, and confidence in the evaluations is 90%. Alternatives (‘by pursuing alternative 1’, ‘alternative 2’, ..., ‘alternative n’) are scored against four criteria weighted 0.1, 0.3, 0.5 and 0.1, giving weighted scores of roughly 80, 51, 61 and 65 and ranks 1, 4, 3 and 2. The resulting confidence in the conclusions, from 0 (not at all confident) to 100 (fully confident), is 53%.

When we ask executives to evaluate their confidence in each of the four components of their analysis, and we highlight that they multiply to a much lower figure than their original estimate, they realise that they might suffer from overconfidence bias.
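
As a worked example of that multiplication, consider the following minimal sketch. Only the 87%, 90% and 53% figures appear in the figure above; the 80% and 85% values for the alternatives and criteria are assumptions chosen for illustration.

```python
# Component confidences compound, so the overall confidence in the conclusions
# is lower than the confidence in any single component.
confidences = {
    "quest":        0.87,
    "alternatives": 0.80,  # assumed
    "criteria":     0.85,  # assumed
    "evaluations":  0.90,
}

overall = 1.0
for component, confidence in confidences.items():
    overall *= confidence

print(f"Overall confidence in the conclusions: {overall:.0%}")  # about 53%
```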

Stepping back, your objective as a problem solver should be to reach an appropriate level of (warranted!) confidence in your results. You can do that by, first, setting a level of confidence that you feel is appropriate for the problem at hand and, second, iterating through consecutive FrED cycles, using each iteration to address the weakest point(s) in your analysis. Watch out! That means that you will need to change (some of) the conclusions you reached in previous iterations. Your confirmation bias will try its best to make you stick to what you thought before. Advising the Caltech 1974 graduating class on being good scientists, Nobel physicist Richard Feynman said, ‘The first principle is that you must not fool yourself – and you are the easiest person to fool’.10 What is true for scientists also applies to problem solvers, so you’ll have to actively engage in debiasing techniques (more on that below). In short, take pride in changing your mind in light of new evidence. In fact, assume that if you don’t change your mind drastically over FrED iterations, your analysis has major weaknesses.
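
Continuing the sketch above, the iteration logic might look something like this. The target and the per-cycle improvement are assumptions; the point is only that each cycle goes after the currently weakest component.

```python
# Iterate FrED cycles, each time working on the weakest component,
# until the overall (multiplicative) confidence reaches the target.
confidences = {"quest": 0.87, "alternatives": 0.80, "criteria": 0.85, "evaluations": 0.90}
TARGET = 0.70                  # assumed: what feels appropriate for this problem
IMPROVEMENT_PER_CYCLE = 0.05   # assumed effect of one focused iteration

def overall(conf: dict) -> float:
    product = 1.0
    for value in conf.values():
        product *= value
    return product

cycle = 0
while overall(confidences) < TARGET:
    weakest = min(confidences, key=confidences.get)
    confidences[weakest] = min(confidences[weakest] + IMPROVEMENT_PER_CYCLE, 0.99)
    cycle += 1
    print(f"Cycle {cycle}: worked on '{weakest}', overall now {overall(confidences):.0%}")
```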

Manage uncertainty

With this approach, your goal is to become continuously less wrong, which means you need to manage uncertainty. To be clear, as a manager, you always face uncertainty; your job is not to eradicate it, but to manage it.11 You should take calculated risks, balancing reaching better conclusions (by going through another iteration of FrED) with implementing whatever strategy you’ve developed so far.

Note that managing uncertainty doesn’t necessarily mean reducing it. Of course, all other things being equal, less uncertainty is better. But all other things aren’t equal; for one, reducing uncertainty requires running more analysis, which is costly. Running more analysis also has an opportunity cost: if the results take too long to arrive, they might come after your window of opportunity has closed (recall Boeing’s predicament in the prologue).

So, you shouldn’t aim at being sure that you’ve found the ‘right’ answer, but at being reasonably confident that you’ve found a fantastic/excellent/good-enough answer.12 Recognising that a good-enough answer executed swiftly often beats a brilliant one implemented slowly, you might decide that, for some of your problems, you’re better served shooting for 60% confidence in your strategy rather than 90%.13

A diagram contrasts the FrED framework drawn as a circle (labelled ‘what it might feel like’) with the framework drawn as a spiral (labelled ‘what is happening’): successive versions of the conclusions (V2, V3, ...) climb from the current level of confidence (based on the analysis so far) towards the target confidence level for the conclusions.

Adopting such a probabilistic mindset requires embracing failures. If you move ahead with less than 100% confidence in your results – as you should – expect that you will be wrong at times. And that’s okay. Think of your decisions as a portfolio. Some will be excellent, others will be acceptable, others still won’t be glorious. But, as a whole, your portfolio will be all right.

A corollary is that you’ll have failures to learn from. Yes, failures are painful, but they provide a fertile ground for learning, possibly more fertile than successes.14 Mastery takes failing. Nobody sat down in front of a piano for the first time and played Schubert’s Forelle flawlessly. In that sense, failing isn’t the opposite of succeeding, but an integral part of succeeding. What matters is to keep the price of failing manageable, taking appropriate risks.15 That’s achievable in two ways: reducing the probability of failures and reducing their impact.

Reduce the probability of failures

If you’re overconfident, you’ll take risks that you don’t suspect, making failures more probable. Conversely, if you’re underconfident, you might spend too much time strategising at the expense of executing, which can also result in failures. So, reducing the probability of failures requires that you calibrate your confidence level to your abilities.16

An ability versus confidence matrix.

One way to reduce overconfidence is to not trust our intuition but, rather, test our intuition, continuously aiming to prove it wrong. To do that, ask yourself ‘what evidence would change my mind?’ and then seek that opposing evidence. In the end, your preferred answers shouldn’t be the ones that have the most supporting evidence, but those that best withstand the strongest critical-thinking attacks.

A good way to do so might be to develop what psychologist Adam Grant calls a challenge network, a group of people you trust to point out your blind spots.17

Reduce the impact of failures when they occur

No matter how hard we try to prevent them, failures will occur. So, we can’t just rely on avoiding them, we must also be good at correcting them. Indeed, there is growing recognition that error prevention must be complemented with error management: approaches to effectively deal with errors after they have occurred.18

Try this! Create effective debriefs

Debriefs (or ‘after-action reviews’) are used in the medical community, the aviation industry, the US army and countless other settings to promote experiential learning through systematising reflection, discussion, and goal setting. Research has shown that they can improve individual and team effectiveness significantly.19 So it might be beneficial to set up effective debriefs.

In a 2013 meta-analysis, psychologists Tannenbaum and Cerasoli observed that debriefs whose participants, intent, and measurement are aligned yield the greatest effects, but that even ‘misaligned’ debriefs demonstrate reasonable efficacy.20

Debriefs are a wonderful opportunity to praise good teamwork and point out improvement opportunities. When debriefing a negative event, it is important to emphasise what went wrong, not who was wrong, and how the team can prevent it from happening in the future.21

Having a documented FrED process handy will help you understand where you could have avoided mistakes. Just by looking at the matrix you can review whether you focused on a poor quest, missed worthy alternatives, omitted relevant criteria, evaluated the alternatives poorly, or misread trade-offs. In short, a well-documented FrED enables you to be more granular in your after-action review. Conversely, if all you have to look back to is a description of the one idea that you recommended and why it is great, it is frequently impossible to retrace the decision process to understand where things went sideways.

Gaining this FrED-powered, in-depth understanding can help you identify whether suboptimal results originated from mistakes or bad luck. Based on this insight, you can work on your systems and processes to mitigate the avoidable mistakes going forward; say, by investing more in exploring alternatives, being more thoughtful with your decision criteria, or working through the trade-offs more deliberately.

 UPDATE YOUR THINKING

So far, all we have to show for our problem-solving efforts is a hypothesis of what we think might be a good solution (our strategy). We ought to test that hypothesis, updating our thinking as new evidence surfaces. In line with the scientific thinking that has guided us throughout this book, our strategy should be a hypothesis that we continuously update. And, in case you’re wondering whether that’s worth all the trouble, research has shown that entrepreneurs who are trained to think like scientists have better outcomes; so, yes!22

Adopt a Bayesian worldview

In probability theory, Bayesian thinking consists of updating one’s thinking in light of new evidence. We probably lost half of you at ‘probability’ in the previous sentence but, if you’re still reading, fear not, dear reader: no equations are coming. The good news is that you can derive some of the benefits of Bayesian thinking simply by adopting it as a mindset.

Applied epistemologist Tim van Gelder makes a compelling case for shifting our problem-solving approach from a Boolean worldview (where everything is deterministic: either true or false; right or wrong) to a Bayesian one (where everything is more or less probable, from 0 (impossible) to 100 (certain)).23

Adopting a Bayesian worldview is useful throughout FrED. For instance, we started with a question that we thought would be a great quest, but diagnosing it brought in new evidence that led us to reframe it. Similarly, we thought of only one answer for our quest, but developing a how map brought in new evidence that resulted in a wider solution space. Updating also helped us identify better criteria and improve our evaluations.

In fact, we should keep updating our thinking even after we have reached a conclusion: as we implement our strategy, we might get new evidence showing a gap between what we predicted would happen and what actually happened. This should trigger a reassessment of how to proceed: maybe you’ll change strategy, maybe you won’t. But if you stay with the current one, that will be the result of a conscious decision, not inertia.

Improve your crystal ball

If you knew what the future looked like, you would make consistently better decisions. So, as a decision maker, you must make predictions. But as the Danish proverb goes, ‘it’s difficult to make predictions, especially about the future’.

Ideally, you are well calibrated, able to make a good initial guess on even an unfamiliar subject and to update your thinking appropriately in light of new evidence. But if you can only have one of these traits, we would argue that it is better to have the latter. An ability to recognise when you’re off and to course correct appropriately is, in so many words, invaluable.

A matrix to improve the crystal ball.

Our initial guess – called our prior – can throw us off in two ways. If we have too strong a prior, we will need massive amounts of opposing evidence to change our mind. This can happen to the best of us: Einstein was so set on the universe being static that he needed to add a parameter – the cosmological constant – to his theory of general relativity.24 Conversely, too weak a prior means we need massive supporting evidence before we accept a hypothesis. That evidence might not be available, thereby hindering progress, as has happened with various scientific advances.25
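
Despite the earlier promise that no equations were coming, readers who like to see the mechanics can look at this minimal sketch of a Bayesian update; all the numbers are invented. It shows why an extreme prior barely moves in the face of the same opposing evidence.

```python
def update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Bayes' rule: probability of hypothesis H after observing evidence E."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

# H: "our chosen strategy is still the best course of action"
# E: an early indicator comes in well below expectations (all numbers invented)
P_E_IF_H, P_E_IF_NOT_H = 0.20, 0.60

# A moderate prior moves noticeably when the evidence disagrees...
print(update(0.70, P_E_IF_H, P_E_IF_NOT_H))  # ~0.44

# ...whereas a near-certain prior barely budges on the same evidence,
# which is how too strong a prior keeps us from changing our mind.
print(update(0.98, P_E_IF_H, P_E_IF_NOT_H))  # ~0.94
```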

For many of us, observing evidence that differs from what we expect is the point where we question the evidence, ignore it, double down on our original path, or obfuscate and distract (not you, of course, dear reader, but think of a politician, whichever one). A quote often attributed to Winston Churchill goes, ‘Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing had happened.’

A diagram depicts Bayesian thinking.

Instead, we should integrate this new information to update our original thinking. But, as we discussed, this doesn’t mean that we ought to delay implementation until we know for sure what’s going to happen. The challenge is balancing preparing for an uncertain future with getting things done. By thinking of our strategy as a hypothesis, one that we continuously update in light of new evidence, we reduce the gap between strategy and execution as we see these elements as interdependent activities that will benefit from greater integration.26

There are many practical implications of adopting a Bayesian worldview. Perhaps most important: If you have low confidence in some components of your decision (quest, alternatives, criteria, or evaluations), build contingency plans in your implementation so that you can course correct swiftly as you go.

Front-load your implementation efforts with two-way-door decisions

One way to categorise decisions is to separate them into one-way-door and two-way-door decisions. One-way-door decisions, once made, are difficult or impossible to modify. Once you walk through that door, it shuts behind you and there’s no handle to let you back in. Think of selling your company, quitting your job, squeezing the toothpaste out of the tube, jumping off that plane (with a parachute, of course!). Once you’ve jumped off the plane, it’s not entirely trivial to get back into the plane.

Irreversible decision?

In 1929, the Indiana Bell Telephone Company bought an eight-storey building so that they could demolish it and build a larger headquarters in its stead. But architect Kurt Vonnegut Sr (yes, the father of the novelist) suggested an alternative: move the building to make room for expansion.

Over the course of a month, using a concrete mat and hydraulic jacks and rollers, the 11,000-tonne building was shifted 16 metres south and rotated 90 degrees. Even what appears as the ultimate irreversible decision – choosing where to place a building – wasn’t so definitive in the end: with some effort, even that decision could be changed.27

But many of our decisions are ‘two-way doors’. With a little effort, they can be changed or reversed.28 If you structure your implementation effort by front-loading it with two-way-door decisions, you retain flexibility in your plan when the uncertainty is the highest – and, therefore, your need to pivot quickly is the greatest.
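
As a minimal sketch of that ordering rule (the decisions listed are invented for illustration, not taken from the book):

```python
# Front-load the plan: reversible ('two-way-door') decisions come first,
# keeping the costly-to-undo ones for later, when more evidence is in.
decisions = [
    {"name": "sign a ten-year lease on a flagship store", "two_way_door": False},
    {"name": "run a three-month pilot in one region",      "two_way_door": True},
    {"name": "sell the legacy business unit",              "two_way_door": False},
    {"name": "trial the new pricing with a small segment", "two_way_door": True},
]

# Reversible decisions sort first (False sorts before True once the flag is negated)
plan = sorted(decisions, key=lambda d: not d["two_way_door"])

for step, decision in enumerate(plan, start=1):
    door = "two-way door" if decision["two_way_door"] else "one-way door"
    print(f"{step}. {decision['name']} ({door})")
```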

Equally useful is to turn seemingly one-way-door decisions into two-way doors. Consider how Richard Branson started his airline: ‘When we launched Virgin Atlantic I made a deal with Boeing that we could hand the plane back in a year’s time if the airline didn’t get off the ground. Thankfully, we never had to. But if things hadn’t worked out, I could have walked back through the door.’ Branson also reduced the risks by leasing a second-hand 747 rather than a new one.29

 HONE THE SKILLS

Like improving any higher-order skill, improving your ability to solve complex problems requires deliberately investing in the process while leveraging timely, constructive feedback.

Train

Research shows that training can effectively improve problem-solving skills, and, ultimately, team performance,30 so you might want to consciously invest in improving your skills. Even if you aren’t in a position to develop an organisation-wide programme, you can still progress. For instance, you might want to test how much you suffer from, say, overconfidence.31 ClearerThinking.org provides some tools to help you self-assess.32 Based on these insights, you can make corrections, such as changing how much you rely on your instinct.
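
One simple self-test you could run is sketched below. The Brier score is a standard calibration measure, not a tool from this book, and the forecasts are placeholders: record your stated confidence on a series of yes/no predictions, note what actually happened, and score yourself.

```python
# Each entry: (stated confidence that the event would happen, did it happen?)
forecasts = [
    (0.90, True),
    (0.80, False),
    (0.60, True),
    (0.70, True),
]

# Brier score: mean squared gap between stated confidence and outcome.
# 0.0 is perfect; always answering 0.5 scores 0.25; confident misses
# (like the 0.80 forecast above) are penalised the most.
brier = sum((p - (1.0 if happened else 0.0)) ** 2 for p, happened in forecasts) / len(forecasts)
print(f"Brier score: {brier:.2f}")  # about 0.23 for these placeholder numbers
```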

The role of effective feedback in supporting learning has been extensively established.33 So, you may want to experiment with FrED in a low-risk environment to generate such feedback. Just as pilots and surgeons use simulators to hone their skills in environments where mistakes bear no consequences, find low-stakes projects that you can use as your simulators. In fact, you may also use those to develop your own checklists of what you need to pay attention to when the pressure increases.34

Build problem-solving habits

The tools in this book will become particularly helpful if you manage to turn them into thinking habits. Once you have used FrED on a few challenges, the core ideas will become second nature and you will no longer need to pull out the book to consult the respective chapters.

Think back to when you learned how to ride a bicycle. For most of us, that required patience and persistence, and the occasional scraped knee. Be similarly persistent with FrED. Your journey to better problem solving starts with realising that there are ways to approach complex problems other than the way you’ve approached them so far. That doesn’t yet mean that you know how to follow these ways, just like having a bicycle doesn’t mean that you know how to ride it. Once you’ve created that awareness, it pays off to consciously use the tools until you firmly grasp them, just like you practised riding your bicycle for a while before it became natural. At some point, it will become second nature.

A matrix for competence versus awareness.

But beware, if you stop being attentive before you have actually ingrained the new habit, you are likely to go back to the old way of doing things. This happens with some students who thoroughly enjoy using FrED while they are in class but, once they return home, they revert to their old intuitive ways.

 FOCUS ON THE PROCESS, NOT THE OUTCOME

The success of our problem-solving efforts doesn’t just depend on what we put into it; luck also plays a role, which is why judging a problem-solving approach by its outcome isn’t advisable.35 After all, even a broken clock is right twice a day; we all get lucky every now and then, even when we follow a poor process. So we might get away with a poor process, but eventually the odds catch up.

Practically, you can use FrED to focus on the process. Keeping in sight the four components of your conclusion – the quest, the alternatives, the criteria, and the evaluations – you can assess whether your process is robust enough given the constraints you face (more on that in the next section).

Another critical skill to acquire is an ability to develop multiple, and potentially opposing, mental models of what solutions could look like.36 Keeping an open mind during the solution process enables you to evolve these models – keeping what’s useful and eliminating what is not – before you come to a conclusion. Author F. Scott Fitzgerald pointed to the importance of this skill when he wrote, ‘The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function.’37

Although sticking to a deliberate process is important to manage time and mental space, be mindful to not overengineer your process. We sometimes see executives invest disproportionate effort and ultimately fail to develop a solution because they run out of time.

 HOW TO SOLVE COMPLEX PROBLEMS IN FIVE MINUTES

FrED can be a good guide if you have weeks or months to solve your problem, but every now and then you walk into a meeting room to discover that you must make a high-stakes decision on the spot. What should you do if you only have a few minutes to think through a complex problem?

Well, FrED’s usefulness remains, as you can use it to mentally check the validity of your thinking. Under time pressure you won’t have the ability to go deep into any of the steps, but you can still run through all of them at a high level by asking yourself:

  • Are we focusing on a good quest? (frame)
  • Are we considering enough alternatives? (explore)
  • Are we using an appropriate set of evaluation criteria? (explore)
  • Are we using good-enough evaluations? (decide)

These questions also come in handy any time you find yourself in a heated debate about the best way forward. Instead of digging in your heels over your preferred alternative (‘I failed to convince them so far but, surely, if I shout!?’), asking yourself these questions will help you understand your interlocutor’s goals, how she thinks about the alternatives, and what she is willing to trade off – three key insights to have before arguing about which way forward is best.

Of course, if all that fails, there’s always shouting.

An illustration shows the FrED framework for complex problems.

 CHAPTER TAKEAWAYS

The quality of your analysis depends on the quality of the quest, alternatives, criteria, and evaluations. These aren’t compensatory, so you need a minimum level for each.

Calibrate your confidence on the quality of your analysis: don’t be too confident simply because you’ve run an analysis.

Rather than trusting your intuition, test it. Be evidence-based, looking primarily for opposing evidence.

Chances are you are wrong about a lot of what you think. Nothing personal, we all are! Identify what you can do about it. In short, be open-minded. Given your specific constraints, try to find a good balance between learning more (being less wrong) and moving forward.

Use FrED iteratively, focusing each cycle on the weakest part of your analysis and adopting a Bayesian approach: update your thinking in light of new evidence. Don’t hesitate to change your mind if the evidence warrants it!

Even when there is no time to use the tools fully, following FrED can help you think in a more structured way; use it as a roadmap.

 CHAPTER 9 NOTES

  1. Espresso, anyone? Although sleep deprivation damages your decision-making abilities, it appears that caffeine mitigates these negative effects. See Killgore, W. D., G. H. Kamimori and T. J. Balkin (2011). ‘Caffeine protects against increased risk-taking propensity during severe sleep deprivation.’ Journal of Sleep Research 20(3): 395–403.
  2. It’s not just beauty sleep. In the Torrey Canyon accident, notice how Captain Rugiati was sleep deprived, which might have contributed to his unwillingness to adjust his thinking. Experiments have shown that sleep deprivation can result, among many other things, in irrational risk taking. See Barnes, C. M. and N. F. Watson (2019). ‘Why healthy sleep is good for business.’ Sleep Medicine Reviews 47: 112–118. See also Chauvin, C. (2011). ‘Human factors and maritime safety.’ The Journal of Navigation 64(4): 625; Harford, T. (2019). Brexit lessons from the wreck of the Torrey Canyon. Financial Times; and Rothblum, A. M. (2000). Human error and marine safety. National Safety Council Congress and Expo, Orlando, FL.
  3. Bourgeon, L., C. Valot, A. Vacher and C. Navarro (2011). Study of perseveration behaviors in military aeronautical accidents and incidents: Analysis of plan continuation errors. Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA, SAGE Publications.
  4. For references, see p. 764 of Miranda, A. T. (2018). ‘Understanding human error in naval aviation mishaps.’ Human Factors 60(6): 763–777.
  5. Winter, S. R., S. Rice, J. Capps, J. Trombley, M. N. Milner, E. C. Anania, N. W. Walters and B. S. Baugh (2020). ‘An analysis of a pilot’s adherence to their personal weather minimums.’ Safety Science 123: 104576.
  6. Ibid.
  7. Uncertainty is an uncomfortable position, but certainty is an absurd one.
  8. See, for instance, Office of the Director of National Intelligence (2015). Analytic standards. Intelligence Community Directive 203; Dhami, M. K. and D. R. Mandel (2021). ‘Words or numbers? Communicating probability in intelligence analysis.’ American Psychologist 76(3): 549; Beyth-Marom, R. (1982). ‘How probable is probable? A numerical translation of verbal probability expressions.’ Journal of Forecasting 1(3): 257–269; and Wintle, B. C., H. Fraser, B. C. Wills, A. E. Nicholson and F. Fidler (2019). ‘Verbal probabilities: Very likely to be somewhat more confusing than numbers.’ PloS One 14(4): e0213522. Also see pp. 25–26 of National Research Council (2006). Completing the forecast: Characterizing and communicating uncertainty for better decisions using weather and climate forecasts. Washington, DC, National Academies Press; and pp. 84–85 of National Research Council (2011). Intelligence analysis: Behavioral and social scientific foundations. Washington, DC, National Academies Press.
  9. Friedman, J. (2020). ‘Analytic rigour is improved by probabilistic thinking and communication.’
  10. Feynman, R. P. (1974). ‘Cargo Cult Science.’ Engineering and Science 37(7): 10–13.
  11. Another take on your job as a manager. We contend that your job, as a manager, isn’t to eradicate uncertainty but to manage it. Management scholar Roger Martin phrases it slightly differently: ‘the objective is not to eliminate risk but to increase the odds of success’ (Martin 2014).
  12. This relates to the idea of a requisite decision model: a model is considered requisite when it provides enough guidance to decide upon a course of action. Phillips, L. D. (1984). ‘A theory of requisite decision models.’ Acta Psychologica 56(1–3): 29–48. See also pp. 55–56 of Goodwin, P. and G. Wright (2014). Decision analysis for management judgment. John Wiley & Sons.
  13. For practical ways to avoid bottlenecks in organisational decision making, see Rogers, P. and M. Blenko (2006). ‘Who has the D?’ Harvard Business Review 84(1): 52–61.
  14. Failing is an integral part of succeeding. For a discussion of how to frame failure in a positive way, see pp. 160–164 of Milkman, K. (2021). How to change: The science of getting from where you are to where you want to be. London, Vermilion.
  15. For practical suggestions on how organisations can re-balance their risk portfolios, see Lovallo, D., T. Koller, R. Uhlaner and D. Kahneman (2020). ‘Your company is too risk averse: Here’s why and what to do about it.’ Harvard Business Review 98(2): 104–111.
  16. For more on confidence calibration, see Moore, D. A. (2021). ‘Perfectly confident leadership.’ California Management Review 63(3): 58–69.
  17. See Chapter 4 of Grant, A. (2021). Think again: The power of knowing what you don’t know. New York, Viking.
  18. See Frese, M. and N. Keith (2015). ‘Action errors, error management, and learning in organizations.’ Annual Review of Psychology 66: 661–687.
  19. Tannenbaum, S. I. and C. P. Cerasoli (2013). ‘Do team and individual debriefs enhance performance? A meta-analysis.’ Human Factors: The Journal of the Human Factors and Ergonomics Society 55(1): 231–245.
  20. Ibid.
  21. See p. 65 of Tullo, F. J. (2010). Teamwork and organizational factors. In Crew resource management, Second edition, Barbara Kanki, Robert Helmreich and J. Anca (eds). London, Elsevier: 59–78.
  22. Camuffo, A., A. Cordova, A. Gambardella and C. Spina (2020). ‘A scientific approach to entrepreneurial decision making: Evidence from a randomized control trial.’ Management Science 66(2): 564–586.
  23. van Gelder, T. (2014). Do you hold a Bayesian or a Boolean worldview? The Age. Melbourne.
  24. Nussbaumer, H. (2014). ‘Einstein’s conversion from his static to an expanding universe.’ The European Physical Journal H 39(1): 37–62.
  25. Bang, D. and C. D. Frith (2017). ‘Making better decisions in groups.’ Royal Society Open Science 4(8): 170193.
  26. Edmondson, A. and P. Verdin (2017). ‘Your strategy should be a hypothesis you constantly adjust.’ Harvard Business Review.
  27. Aldrich, S. (2010). Kurt Vonnegut’s Indianapolis. National Geographic.
  28. Gregersen, H. (2021). ‘When a leader like Bezos steps down, can innovation keep up?’ Sloan Management Review.
  29. The Telegraph (2018). Sir Richard Branson: The business of risk. From https://www.youtube.com/watch?v=-49524mB49520gY.
  30. McEwan, D., G. R. Ruissen, M. A. Eys, B. D. Zumbo and M. R. Beauchamp (2017). ‘The effectiveness of teamwork training on teamwork behaviors and team performance: A systematic review and meta-analysis of controlled interventions.’ PloS One 12(1): e0169604.
  31. Incompetent and unaware. People tend to overestimate their abilities in many social and intellectual domains. See Kruger, J. and D. Dunning (1999). ‘Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments.’ Journal of Personality and Social Psychology 77(6): 1121.
  32. Clearer Thinking (2021). ‘Make better decisions.’ Retrieved 30 July 2021, from https://www.clearerthinking.org.
  33. See, for instance, pp. 52–53 of National Research Council (2011). Intelligence analysis for tomorrow: Advances from the behavioral and social sciences. Washington, DC, National Academies Press.
  34. For an example of the usefulness of checklists, see Gawande, A. (2007). The checklist. For an in-depth discussion, see Gawande, A. (2009). The checklist manifesto. New York, Picador.
  35. Don’t judge by the outcome. Evaluating a decision by its outcome rather than its process is called outcome bias or ‘resulting’. For more, see Baron, J. and J. C. Hershey (1988). ‘Outcome bias in decision evaluation.’ Journal of Personality and Social Psychology 54(4): 569; and pp. 1–24 of Duke, A. (2020). How to decide: Simple tools for making better choices. Penguin.
  36. For a detailed, hands-on approach on the usefulness of mental models, see Martin, R. L. (2009). The opposable mind: How successful leaders win through integrative thinking. Harvard Business Press.
  37. Fitzgerald, F. S. (1936). The crack-up. Esquire.