So, having framed, explored, and decided, we have now identified our strategy. Or have we? Under uncertainty, it's hard to tell. Chapter 9 shows how you can better manage uncertainty by adopting a probabilistic mindset. In addition, it shows how to further improve your problem solving by honing your skills and focusing on the process rather than the outcome. The chapter concludes by showing how FrED can help you make big decisions in just a few minutes.
Let’s first look at managing uncertainty. Despite our best efforts, it is likely that our conclusions still partly rely on assumptions. Not only that, but circumstances might also have changed during the problem-solving process; after all, it’s not because we have finished our analysis that the environment has stopped evolving or that new evidence has stopped appearing.
So, how should we deal with changing circumstances? Continuing with a strategy when we should change course is called 'plan continuation bias' or PCE (for 'plan continuation error'); it's a prevalent bias behind the Torrey Canyon oil spill and a plethora of less-visible mishaps.
The Torrey Canyon supertanker fails to update its course
In the early morning of Saturday, 18 March 1967, the first officer of the supertanker Torrey Canyon corrected the ship's course after realising that it wasn't where it was supposed to be. The 300-metre ship was carrying 120,000 tonnes of crude oil north past the Scilly Isles, just off the coast of South West England.
When sleep-deprived captain Rugiati woke, he countermanded the first officer's change.1 He was under a tight deadline to reach his destination at high tide, and the two-hour detour the first officer had initiated might have meant waiting five days for a new window. So Captain Rugiati stuck to his original plan, going through the Scilly Isles instead of around them. The decision had a monumental environmental impact: the ship ran aground, spilling its cargo of crude over a 300-km stretch of British and French coasts in what was at the time the world's worst oil spill.
Aside from time pressure, the Torrey Canyon also suffered from other issues, including a less than ideal navigation system and poor equipment design. Yet, a chief cause of the accident appeared to be Captain Rugiati’s slowness to adjust in light of new evidence. He stuck to his original plan for far too long, and when he finally updated it, it was too late to avert the accident.2
Overconfidence is a primary cause of PCEs. For instance, a study of accidents involving French military aircraft found that, in a majority of cases (54%), the crew persevered not because they failed to process information but because they placed too much trust in their ability to manage risk. What's more, we rarely get out of these situations on our own: 80% of recoveries from perseveration involved outside intervention.3
The aviation community treats errors as symptoms of deeper issues,4 and it takes systematic, evidence-based steps to reduce them, so we would do well to learn from it. For instance, one deliberate practice is for pilots to set personal weather minimums below which they will abort a landing.5 What would be an equivalent mitigation approach for strategic decision making in organisations? Well, adopting a two-pillar approach might help:
Doubt is not a pleasant condition, but certainty is absurd.7

– Voltaire (letter to Frederick II of Prussia, 6 April 1767)
Having gone through a FrED cycle, evaluate your confidence that the strategy you selected is what you ought to do, from 0 (not at all confident) to 100 (absolutely sure).
In philosopher and mathematician Bertrand Russell’s words, ‘Everything is vague to a degree you do not realise till you have tried to make it precise’. That observation is particularly salient for the evaluation of probability. In fact, people interpret likelihood labels (e.g. very likely, unlikely, roughly even odds) in different ways, sometimes vastly different ways, so you will be well served to evaluate your confidence numerically (from 0 to 100).8
| Almost no chance | Very unlikely | Unlikely | Roughly even chance | Likely | Very likely | Almost certain(ly) |
|---|---|---|---|---|---|---|
| 1–5% | 5–20% | 20–45% | 45–55% | 55–80% | 80–95% | 95–99% |
Some detractors of this approach often point out that numerical estimates aren’t particularly intuitive to most people. However, Dartmouth’s professor of government Jeff Friedman offers a different take: ‘Most people are between the ages of 0 and 100. When you see a stranger, you can usually narrow down a plausible range for that person’s age. Maybe, all else being equal, you think that person is between 30 and 50 years old. If you have no other information, you take the midpoint, 40. Maybe that seems a bit low, so you increase your estimate to 42. I believe that few people would find that logic to be complex or unusual.
All probabilities are between 0 and 100%. When you’re asked to estimate the chances that a statement is true, you can usually narrow down a plausible range for what that probability should entail. Maybe, all else being equal, you think that probability is between 30 and 50%. If you have no other information, you take the midpoint, 40%. Maybe that seems a bit low, so you increase your estimate to 42%. From a logical standpoint, it’s really no different than estimating a stranger’s age.
So why do we intuitively think that estimating probabilities is odd? I think the answer is that we don’t have much opportunity to calibrate our judgments about uncertainty.
Over the course of your life, you get lots of feedback about what 42-year-old people look like. If I ask you to think about a 42-year-old person, you can probably conjure up a concrete image of what that entails. By contrast, uncertainty is abstract, and very few people spend the time and effort to calibrate their assessments of uncertainty. Thus, it's no surprise that people find it hard to consider what a 42% chance 'looks like'. But that isn't grounds for thinking that probabilistic reasoning is invalid or inappropriate: it's just a reason to spend more time and effort figuring out how to reason in rigorous ways.'9
When we ask executives to rate their confidence in the conclusions of an analysis they just conducted, many report being highly confident (often at 80% or higher).
But note that your confidence in your conclusions depends on your confidence in your quest, alternatives, criteria, and evaluations – and those are multiplicative, so any weakness in your analysis transfers to your conclusions!
When we ask executives to evaluate their confidence in each of the four components of their analysis, and we highlight that they multiply to a much lower figure than their original estimate, they realise that they might suffer from overconfidence bias.
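To see why the components multiply down so quickly, consider this quick sketch. The 80% figures are hypothetical, and it assumes, for simplicity, that confidence in the four components combines multiplicatively:

```python
# Overall confidence as the product of the four component confidences.
# The 80% figures are illustrative, not data from any particular study.
components = {
    "quest": 0.80,
    "alternatives": 0.80,
    "criteria": 0.80,
    "evaluations": 0.80,
}

overall = 1.0
for name, confidence in components.items():
    overall *= confidence

print(f"Overall confidence: {overall:.0%}")  # → Overall confidence: 41%
```

Four components at a seemingly comfortable 80% each combine to roughly 41% overall, about half the confidence most executives initially report.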
Stepping back, your objective as a problem solver should be to reach an appropriate level of (warranted!) confidence in your results. You can do that by first setting a level of confidence that you feel is appropriate for the problem at hand and, second, iterating through consecutive FrED cycles, using each iteration to address the weakest point(s) in your analysis. Watch out! That means you will need to change (some of) the conclusions you reached in previous iterations. Your confirmation bias will try its best to make you stick to what you thought before. Advising Caltech's 1974 graduating class on being good scientists, Nobel physicist Richard Feynman said, 'The first principle is that you must not fool yourself – and you are the easiest person to fool'.10 What is true for scientists also applies to problem solvers, so you'll have to actively engage in debiasing techniques (more on that below). In short, take pride in changing your mind in light of new evidence. In fact, assume that if you don't change your mind drastically over FrED iterations, your analysis has major weaknesses.
With this approach, your goal is to become continuously less wrong, which means you need to manage uncertainty. To be clear, as a manager, you always face uncertainty; your job is not to eradicate it, but to manage it.11 You should take calculated risks, balancing reaching better conclusions (by going through another iteration of FrED) with implementing whatever strategy you’ve developed so far.
Note that managing uncertainty doesn't necessarily mean reducing it. Of course, all other things being equal, less uncertainty is better. But all other things aren't equal; for one, reducing uncertainty requires running more analysis, which is costly. Also, running more analysis has an opportunity cost: if results take too long to arrive, they might come after your window of opportunity has closed (recall Boeing's predicament in the prologue).
So, you shouldn't aim at being sure that you've found the 'right' answer, but at being reasonably confident that you've found a fantastic/excellent/good-enough answer.12 Recognising that a good-enough answer executed swiftly often beats a brilliant one implemented slowly, you might decide that, for some of your problems, you're better served shooting for 60% confidence in your strategy rather than 90%.13
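A back-of-the-envelope sketch can make this speed-versus-confidence trade-off concrete. All figures here are hypothetical, including the assumption that the opportunity window shrinks by 5% for each month spent deliberating:

```python
# Hypothetical comparison: a good-enough strategy shipped fast versus a
# near-perfect one shipped slowly. Payoffs and decay rate are illustrative only.

def expected_value(confidence, payoff_if_right, months_to_decide, decay_per_month=0.05):
    """Expected payoff, discounted for the opportunity window shrinking over time."""
    window_remaining = max(0.0, 1.0 - decay_per_month * months_to_decide)
    return confidence * payoff_if_right * window_remaining

good_enough = expected_value(confidence=0.60, payoff_if_right=100, months_to_decide=1)
brilliant = expected_value(confidence=0.90, payoff_if_right=100, months_to_decide=8)

print(good_enough)  # → 57.0
print(brilliant)    # → 54.0
```

Under these (made-up) assumptions, the 60%-confidence strategy shipped in one month edges out the 90%-confidence strategy shipped in eight; the point is not the specific numbers but that extra analysis has a price.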
Adopting such a probabilistic mindset requires embracing failures. If you move ahead with less than 100% confidence in your results – as you should – expect to be wrong at times. And that's okay. Think of your decisions as a portfolio. Some will be excellent, others will be acceptable, others still won't be glorious. But, as a whole, your portfolio will be all right.
A corollary is that you'll have failures to learn from. Yes, failures are painful, but they provide fertile ground for learning, possibly more fertile than successes.14 Mastery takes failing. Nobody ever sat down at a piano for the first time and played Schubert's 'Die Forelle' flawlessly. In that sense, failing isn't the opposite of succeeding but an integral part of it. What matters is keeping the price of failing manageable by taking appropriate risks.15 That's achievable in two ways: reducing the probability of failures and reducing their impact.
If you're overconfident, you'll take risks you don't suspect you're taking, making failures more probable. Similarly, if you're overconfident, you might spend too much time strategising at the expense of executing, which can also result in failures. So, reducing the probability of failures requires calibrating your confidence level to your abilities.16
One way to reduce overconfidence is not to trust our intuition but to test it, continuously aiming to prove it wrong. To do that, ask yourself 'what evidence would change my mind?' and then seek out that opposing evidence. In the end, your preferred answers shouldn't be the ones with the most supporting evidence, but those that best withstand the strongest critical-thinking attacks.
A good way to do so might be to develop what psychologist Adam Grant calls a challenge network, a group of people you trust to point out your blind spots.17
No matter how hard we try to prevent them, failures will occur. So, we can’t just rely on avoiding them, we must also be good at correcting them. Indeed, there is growing recognition that error prevention must be complemented with error management: approaches to effectively deal with errors after they have occurred.18
Try this! Create effective debriefs
Debriefs (or ‘after-action reviews’) are used in the medical community, the aviation industry, the US army and countless other settings to promote experiential learning through systematising reflection, discussion, and goal setting. Research has shown that they can improve individual and team effectiveness significantly.19 So it might be beneficial to set up effective debriefs.
In a 2013 meta-analysis, psychologists Tannenbaum and Cerasoli observed that aligning participants, intent, and measurement yields the greatest effects, but that even 'misaligned' debriefs demonstrate reasonable efficacy.20
Debriefs are a wonderful opportunity to praise good teamwork and point out improvement opportunities. When debriefing a negative event, it is important to emphasise what went wrong, not who was wrong, and how the team can prevent it from happening in the future.21
Having a documented FrED process handy will help you understand where you could have avoided mistakes. Just by looking at the matrix, you can review whether you focused on a poor quest, missed worthy alternatives, omitted relevant criteria, evaluated the alternatives poorly, or misread trade-offs. In short, a well-documented FrED enables you to be more granular in your after-action review. Conversely, if all you have to look back on is a description of the one idea you recommended and why it is great, it is frequently impossible to retrace the decision process to understand where things went sideways.
Gaining this FrED-powered, in-depth understanding can help you identify whether suboptimal results originated from mistakes or bad luck. Based on this insight, you can work on your systems and processes to mitigate the avoidable mistakes going forward; say, by investing more in exploring alternatives, being more thoughtful with your decision criteria, or working through the trade-offs more deliberately.
So far, all we have to show for our problem-solving efforts is a hypothesis of what we think might be a good solution (our strategy). We ought to test that hypothesis, updating our thinking as new evidence surfaces. In line with the scientific thinking that has guided us throughout this book, our strategy should be a hypothesis that we continuously update. And, in case you’re wondering whether that’s worth all the trouble, research has shown that entrepreneurs who are trained to think like scientists have better outcomes; so, yes!22
In probability theory, Bayesian thinking consists of updating one's thinking in light of new evidence. We probably lost half of you at 'probability' in the previous sentence but, if you're still reading, fear not, dear reader: no equations are coming. The good news is that you can derive some of the benefits of Bayesian thinking simply by adopting it as a mindset.
Applied epistemologist Tim van Gelder makes a compelling case for shifting our problem-solving approach from a Boolean worldview (where everything is deterministic: either true or false, right or wrong) to a Bayesian one (where everything is more or less probable, from 0 (impossible) to 100 (certain)).23
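Bayes' rule makes this mindset concrete. Below is a minimal sketch of a single update; the function is standard Bayes' rule, but every number in the example is hypothetical, chosen only to illustrate the mechanics:

```python
# A single Bayesian update: revise a prior belief in light of new evidence.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior probability of a hypothesis after observing the evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Suppose we are 60% confident our chosen strategy is right (the prior),
# and a pilot test succeeds. Say such a success is 70% likely if the
# strategy is right, but only 30% likely if it is wrong.
posterior = bayes_update(prior=0.60, p_evidence_if_true=0.70, p_evidence_if_false=0.30)
print(f"Updated confidence: {posterior:.0%}")  # → Updated confidence: 78%
```

Evidence that is more likely if the strategy is right than if it is wrong nudges confidence up (here, from 60% to roughly 78%); surprising evidence would push it down.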
Adopting a Bayesian worldview is useful throughout FrED. For instance, we started with a question that we thought would be a great quest, but diagnosing it brought in new evidence that led us to reframe it. Similarly, we thought of only one answer to our quest, but developing a how map brought in new evidence that resulted in a wider solution space. Updating also helped us identify better criteria and improve our evaluations.
In fact, we should keep updating our thinking even after we have reached a conclusion: as we implement our strategy, we might find new evidence showing a gap between what we predicted would happen and what actually happened. That should trigger revisiting how to proceed: maybe you'll change strategy, maybe you won't. But if you stay with the current one, it will be the result of a conscious decision, not inertia.
If you knew what the future held, you would make consistently better decisions. So, as a decision maker, you must make predictions. But, as the Danish proverb goes, 'it's difficult to make predictions, especially about the future'.
Ideally, you are well calibrated: able to make a good initial guess even on an unfamiliar subject and to update your thinking appropriately in light of new evidence. But if you can have only one of these traits, we would argue the latter is the better one. An ability to recognise when you're off course and to correct appropriately is invaluable.
Our initial guess – called our prior – can throw us off in two ways. If our prior is too strong, we will need massive amounts of opposing evidence to change our mind. This can happen to the best of us: Einstein was so set on the universe being static that he added a parameter – the cosmological constant – to the theory of general relativity.24 Conversely, if our prior is too weak, we will need massive supporting evidence before accepting a claim. That evidence might not be available, thereby hindering progress, as has happened with various scientific advances.25
For many of us, observing evidence that differs from what we expect is the point where we question the evidence, ignore it, double down on our original path, or obfuscate and distract (not you, of course, dear reader, but think of a politician, whichever one). A quote often attributed to Winston Churchill goes, 'Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing had happened.'
Instead, we should integrate this new information to update our original thinking. But, as we discussed, this doesn’t mean that we ought to delay implementation until we know for sure what’s going to happen. The challenge is balancing preparing for an uncertain future with getting things done. By thinking of our strategy as a hypothesis, one that we continuously update in light of new evidence, we reduce the gap between strategy and execution as we see these elements as interdependent activities that will benefit from greater integration.26
There are many practical implications of adopting a Bayesian worldview. Perhaps most important: If you have low confidence in some components of your decision (quest, alternatives, criteria, or evaluations), build contingency plans in your implementation so that you can course correct swiftly as you go.
One way to categorise decisions is to separate them into one-way-door and two-way-door decisions. One-way-door decisions, once made, are difficult or impossible to modify. Once you walk through that door, it slams shut behind you and there's no handle to let you back in. Think of selling your company, quitting your job, squeezing the toothpaste out of the tube, or jumping out of a plane (with a parachute, of course!). Once you've jumped, it's not entirely trivial to get back into the plane.
Irreversible decision?
In 1929, the Indiana Bell Telephone Company bought an eight-storey building so that they could demolish it and build a larger headquarters in its stead. But architect Kurt Vonnegut Sr (yes, the father of the novelist) suggested an alternative: move the building to make room for expansion.
Over the course of a month, using a concrete mat, hydraulic jacks, and rollers, the 11,000-tonne building was shifted 16 metres south and rotated 90 degrees. Even what appears to be the ultimate irreversible decision – choosing where to place a building – wasn't so definitive in the end: with some effort, even that decision could be changed.27
But many of our decisions are ‘two-way doors’. With a little effort, they can be changed or reversed.28 If you structure your implementation effort by front-loading it with two-way-door decisions, you retain flexibility in your plan when the uncertainty is the highest – and, therefore, your need to pivot quickly is the greatest.
Equally useful is to turn seemingly one-way door decisions into two-way doors. Consider how Richard Branson started his airline: ‘When we launched Virgin Atlantic I made a deal with Boeing that we could hand the plane back in a year’s time if the airline didn’t get off the ground. Thankfully, we never had to. But if things hadn’t worked out, I could have walked back through the door.’ Branson also reduced the risks by leasing a second-hand 747 rather than a new one.29
As with any higher-order skill, improving your ability to solve complex problems requires deliberately investing in the process while leveraging timely, constructive feedback.
Research shows that training can effectively improve problem-solving skills, and, ultimately, team performance,30 so you might want to consciously invest in improving your skills. Even if you aren’t in a position to develop an organisation-wide programme, you can still progress. For instance, you might want to test how much you suffer from, say, overconfidence.31 ClearerThinking.org provides some tools to help you self-assess.32 Based on these insights, you can make corrections, such as changing how much you rely on your instinct.
The role of effective feedback in supporting learning is well established.33 So you may want to experiment with FrED in a low-risk environment to generate such feedback. Just as pilots and surgeons use simulators to hone their skills in environments where mistakes bear no consequences, find low-stakes projects that you can use as your simulators. In fact, you may also use these to develop your own checklists of what to pay attention to when the pressure increases.34
The tools in this book will become particularly helpful if you manage to turn them into thinking habits. Instead of pulling out the book to consult the respective chapters, once you use FrED on a few challenges, the core ideas will become second nature to you.
Think back to when you learned how to ride a bicycle. For most of us that required patience, persistence, and the occasional banged-up knee. Be similarly persistent with FrED. Your journey to better problem solving starts with realising that there are ways to approach complex problems differently than you've approached them so far. That doesn't yet mean you know how to follow those ways, just as owning a bicycle doesn't mean you know how to ride it. Once you've developed that awareness, it pays to use the tools consciously until you firmly grasp them, just as you practised riding your bicycle for a while before it became natural.
But beware, if you stop being attentive before you have actually ingrained the new habit, you are likely to go back to the old way of doing things. This happens with some students who thoroughly enjoy using FrED while they are in class but, once they return home, they revert to their old intuitive ways.
The success of our problem-solving efforts doesn't just depend on what we put into them; luck also plays a role, which is why judging a problem-solving approach by its outcome isn't advisable.35 After all, even a broken clock is right twice a day; we all get lucky every now and then, even when we follow a poor process. So we might get away with a poor process for a while, but eventually the odds catch up with us.
Practically, you can use FrED to focus on the process. Keeping in sight the four components of your conclusion – the quest, the alternatives, the criteria, and the evaluations – you can assess whether your process is robust enough given the constraints you face (more on that in the next section).
Another critical skill to acquire is an ability to develop multiple, and potentially opposing, mental models of what solutions could look like.36 Keeping an open mind during the solution process enables you to evolve these models – keeping what’s useful and eliminating what is not – before you come to a conclusion. Author F. Scott Fitzgerald pointed to the importance of this skill when he wrote, ‘The test of a first-rate intelligence is the ability to hold two opposing ideas in mind at the same time and still retain the ability to function.’37
Although sticking to a deliberate process is important for managing time and mental space, be mindful not to overengineer your process. We sometimes see executives invest disproportionate effort and ultimately fail to develop a solution because they run out of time.
FrED can be a good guide if you have weeks or months to solve your problem, but every now and then, you walk into a meeting room to discover that you must make a high-stakes decision on the spot. What should you do if you only have a few minutes to think through a complex problem?
Well, FrED’s usefulness remains, as you can use it to mentally check the validity of your thinking. Under time pressure you won’t have the ability to go deep into any of the steps, but you can still run through all of them at a high level by asking yourself:
These questions also come in handy any time you find yourself in a heated debate about the best way forward. Instead of digging in your heels over your preferred alternative ('I failed to convince them so far but, surely, if I shout!?'), asking yourself these questions will help you understand your interlocutor's goals, how she thinks about the alternatives, and what she is willing to trade off – three key insights to have before debating the best way forward.
Of course, if all that fails, there’s always shouting.
The quality of your analysis depends on the quality of the quest, alternatives, criteria, and evaluations. These aren’t compensatory, so you need a minimum level for each.
Calibrate your confidence on the quality of your analysis: don’t be too confident simply because you’ve run an analysis.
Rather than trusting your intuition, test it. Be evidence-based, looking primarily for opposing evidence.
Chances are you are wrong about a lot of what you think. Nothing personal, we all are! Identify what you can do about it. In short, be open-minded. Given your specific constraints, try to find a good balance between learning more (being less wrong) and moving forward.
Use FrED iteratively, focusing each cycle on the weakest part of your analysis and adopting a Bayesian approach: update your thinking in light of new evidence. Don’t hesitate to change your mind if the evidence warrants it!
Even when there is no time to use the tools fully, following FrED can help you think in a more structured way; use it as a roadmap.