CHAPTER 3

What Is Complexity?

There is perhaps no concept more critical for the risk world to appreciate and understand than the difference between things that are simple, complicated, and complex. These are the types of “systems” into which systems thinkers categorize problems, issues, and situations. They are relatively easy to distinguish, and any experienced risk manager should be able to relate to the differences easily. The categorization, though, is not an academic exercise, as the different types of systems need to be thought about and managed very differently.

There are two problems though. The first is that there is little knowledge of systems thinking, and thus very few risk managers are even aware that these different ways of thinking about systems exist. This lack of awareness exists at the explicit knowledge level and, perhaps more troubling, at the intuitive level as well. The second problem is that the risk management field has complicated thinking tools, and we all know the old adage about having a hammer and then acting as if everything is a nail.

As will be argued in this chapter, complicated thinking is inappropriate for the vast majority of important risk decisions. Making the problem worse, complicated thinking has the potential to make the solutions and actions devised cause much more harm than doing nothing. Indeed, it can be argued that the 2008 crisis was caused almost entirely by complicated thinking. Furthermore, many pundits, including me, believe that the continued insistence on complicated thinking will make the next crisis not only more likely, but also more severe.

Many of the false axioms discussed in the previous chapter are the result of complicated thinking. Complicated thinking is convenient, and it is also, in a way, sexy. As the name implies, it is frequently complicated in the normal sense of the word, and thus, it lends credence to risk as a credible profession and specialty. Complicated thinking has its uses, and indeed, is a valuable and necessary skill for the risk manager to have—but only for those situations where it is applicable. Medicine is generally a complicated profession, but one does not want a heart surgeon to be running a sophisticated currency hedging program that utilizes exotic derivatives, just as you do not want a risk engineer to perform your heart bypass surgery.

It is time for the risk management profession to learn about systems, and more specifically, about complexity. The purpose of this chapter is to introduce systems thinking, illustrate its importance, and put complexity thinking at the forefront of risk management, where it should be. Complexity explains a lot of different business and economic phenomena, such as why certain videos go viral, the rise in importance of social media, the emergence of fads, and even boom and bust cycles in the stock market. Complexity can also help us understand a lot of issues concerning risk. More important than explaining things, though, complexity provides a new way of thinking about how to manage risk. A paradigm shift is needed to move away from the ever-increasing dominance of complicated thinking to the more nuanced and appropriate complexity thinking.

Ecology for Bankers

In the 1970s, the noted biologist Robert May began warning the biology profession that there was something behind natural processes beyond the objective and reductionist rules that biologists were applying to advance their field as a discipline of scientific study. The phenomenon that Robert May was pointing to was the nascent field of complexity theory, itself a subset of the larger systems way of thinking. The field of risk management should revisit Dr. May’s arguments, which proved to be invaluable for progress in understanding biological processes.

Dr. May, along with Simon Levin and George Sugihara, wrote a short, but very interesting, article titled “Ecology for Bankers.”1 The article highlights the similarities between ecological systems and systemic risk in the financial system. Ecological systems are connected, as are financial systems. Connected systems in which each of the parts can adapt to the actions of other parts are known as complex systems. A complex system reacts in very particular and unpredictable ways. It stands in direct opposition to what is known as a complicated system, whose parts act independently of how other parts of the system act.

The predominant paradigm, particularly in risk management, is that the world is rational and reductionist. In more basic terms, this implies that things work according to a set of laws, and furthermore, that each of the pieces of a system or an issue can be examined individually and then put together to form a logically consistent whole. This dominant paradigm is particularly obvious if one examines the current academic research in risk management.

Studying phenomena with a completely rational and reductionist mindset is a clockwork view of the world. It assumes that outcomes are reproducible, in that if one conducts the exact same activities, then one will get the exact same results. However, risk management has several important differences from the workings of a clock. A clock is composed of springs, gears, and levers, which work in precise and predictable ways that are based on the well-known and well-tested laws of physics. In order to create an accurate watch that will last for decades, all one needs to do is put the appropriate pieces together in the correct sequence and in the correct proportions according to a set of blueprints. Risk management, meanwhile, has no mechanisms that work in precise and well-known ways. You cannot put together a successful risk management program that will last for decades by simply following a set of blueprints.

The clockwork view of the world is the basis of the scientific method that we all learned in primary school. Business, of course, also has its equivalent in scientific management, first popularized by Frederick Taylor. The scientific management paradigm is, of course, the reason behind business schools and Masters of Business Administration programs. Scientific management is also the basis for most of the discipline of risk management. It is a rational, rules-based view of risk management, which assumes that risk management is a scientific discipline that operates much the same as physics does. It is a very desirable paradigm for several reasons, but it is also a patently false, misleading, incomplete, and arguably dangerous paradigm to be operating under.

Introduction to Systems Thinking

Systems theory is an evolving paradigm that is proving to be of great use in providing a different lens through which to examine issues in the management of natural systems, social systems, information management, and in general, business management itself. The qualities of systems thinking make it particularly appropriate for risk management.

The primary characteristic of systems thinking is that reductionist thinking should not automatically be assumed to be the appropriate paradigm. Breaking a system down into its constituent parts, developing an understanding of the system simply by examining each of its parts, and then reconstituting the parts to recreate the whole is seen as an incorrect and misleading way of analyzing situations, unless it can be clearly demonstrated otherwise. Systems thinking instead involves examining, in a holistic fashion, the underlying processes that affect how a situation unfolds.

The three main types of systems are simple, complicated, and complex. Something that is simple can be accomplished by following a few simple rules or, if you will, a recipe of sorts. It usually does not take any special training to deal with. Furthermore, it is not particularly sensitive to how closely the rules or recipe are followed. For instance, conducting the safety announcement on a plane is an example of a simple risk task.

A complicated system or task is a process that follows very well-defined and rigid rules or laws. It is also a process in which the results are reproducible in that if the exact same conditions exist, and the exact same steps are followed, then the exact same results will be produced. A complicated process is often compared to Newtonian physics in which there are well-known rules from which very precise outcomes, such as the movement of the planets, can be calculated with an amazing degree of precision. An example of a complicated risk process is dynamic hedging of a financial option by using the Black–Scholes option pricing formula. If one uses the same inputs, one will get the same hedge ratios of the offsetting position, and the same profit or loss from hedging the option if the underlying price process is the same.2
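The reproducibility of a complicated process can be made concrete. The sketch below computes the Black–Scholes hedge ratio (the delta) of a European call option; the numerical inputs are illustrative only, but the point is that identical inputs always produce the identical hedge ratio, which is the hallmark of a complicated, rules-based process.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    # Standard normal cumulative distribution via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call_delta(S, K, T, r, sigma):
    """Black-Scholes delta (hedge ratio) of a European call.
    S: spot price, K: strike, T: years to expiry,
    r: risk-free rate, sigma: volatility."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1)

# Same inputs, same output, every single time -- a reproducible,
# rules-based (i.e., complicated) process.
delta = bs_call_delta(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
assert delta == bs_call_delta(S=100, K=100, T=1.0, r=0.05, sigma=0.2)
```

This determinism is exactly what makes complicated tasks suitable for automation, and exactly what a complex system lacks.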

Simple and complicated systems share the feature that they are rules-based. The difference is in the robustness with which the rules apply. The presence of rules, even loosely applied rules, means that managing simple or complicated processes is relatively straightforward, once one knows the underlying rules or recipes. Underlying a simple or complicated process is a known—or at least conceptually knowable—process or set of processes. Additionally, simple and complicated systems can be analyzed in a reductionist fashion; their component parts can be broken down, studied and managed separately, and reassembled to construct the whole. This makes managing them a relatively straightforward task.

A key aspect of simple and complicated tasks is that they can be digitized, and thus managed by a computer or a robot. Self-driving vehicles, particularly on the highway, are an example. Self-driving vehicles are projected to be much safer and more efficient than human-operated cars. As computing power increases, and as the mechanics of robots improve, many more tasks can be expected to be automated, which, in turn, will increase the safety and efficacy of those tasks.

Scientific management is rooted in the reducibility of complicated systems. Take, for instance, the assembly line, which Frederick Taylor is credited with inventing and Henry Ford with perfecting and proving the value of. Taylor’s famous time–motion studies, in which he broke each task down into its component parts and then individually optimized each component, were key to the development of the modern factory. For instance, Taylor went so far as to measure the height and placement of the assembly line in relation to the worker to make the actions of the human assembly line worker as compact, time-efficient, and energy-efficient as possible. As another example, Taylor’s studies demonstrated that the ideal shovel size would allow a man shoveling to move exactly 22 pounds with each scoop, which would lead to the most material being shoveled within a workday. Thus, the shovel head was sized to hold 22 pounds of material; different materials with different densities would necessitate different sized shovel heads.

A complex system is fundamentally different. A complex system is one in which a phenomenon known as leaderless emergence is a central feature. In a complex system, there are no underlying rules or processes. Although patterns may be observable, they are neither predictable nor repeatable. Doing the exact same thing will not produce the exact same results. In fact, the results are just as likely to be the opposite of those produced previously.

Complexity arises whenever a situation involves a number of different agents that have the capability to adapt and change. In essence, complexity can arise whenever there is a group of people. A classic example of a complex system is the stock market. There is no leader of the stock market. Although there are many stock market pundits, none of them control whether the market rises or falls over any given period. Whether prices go up or down depends on the collective actions of the thousands of individual traders in a given stock. In turn, each of these traders will adjust their actions and their perception of the correct stock price based in part on the actions of other traders. It is this set of continually adapting actions that leads to the prices of stocks rising and falling. Stock market patterns are observable with hindsight, but they are not predictable. No one knows at any given point in time whether the market will rise or fall or remain relatively calm and unchanged. It is a dynamic system, and it is an incalculable system. It is complex.
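Emergence of this kind can be sketched in a few lines of code. The toy model below is invented purely for illustration (the trader rules, parameters, and names are assumptions, not a real market model): each trader reacts to the last price move, some by herding with it and some by betting against it, and the price path emerges from their interactions with no agent in control.

```python
import random

def simulate_market(n_traders=100, steps=50, seed=None):
    """Toy complex system: each trader reacts to the last price move.
    'Herders' lean with it, 'contrarians' lean against it, and everyone
    is a little noisy. The price path emerges from their interactions;
    no single agent controls it."""
    rng = random.Random(seed)
    styles = [rng.choice(["herd", "contrarian"]) for _ in range(n_traders)]
    price, last_move = 100.0, 0.0
    path = [price]
    for _ in range(steps):
        net = 0
        for style in styles:
            signal = 1 if last_move > 0 else (-1 if last_move < 0 else 0)
            lean = signal if style == "herd" else -signal
            # Mostly act on the lean; sometimes trade at random
            order = lean if lean != 0 and rng.random() < 0.7 else rng.choice([1, -1])
            net += order  # +1 is a buy, -1 is a sell
        last_move = 0.01 * net              # net order flow moves the price
        price = max(price + last_move, 0.01)
        path.append(price)
    return path

# Identical rules, different seeds: the "same" market produces entirely
# different histories -- observable in hindsight, not predictable.
run_a, run_b = simulate_market(seed=1), simulate_market(seed=2)
```

Contrast this with the Black–Scholes calculation: there, the rules fully determine the output; here, the same rules generate a different history every run.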

Where Does Risk Management Lie on the Spectrum?

In my experience, almost every risk management problem of consequence falls into the realm of complexity. The problem is that almost every risk management response is a complicated one. Thus, there is a fundamental mismatch.

As will be discussed in the next chapter, major risk management issues almost always involve people, either as individuals, groups, societies, or even economies. Rarely is a major risk management issue solely a mechanical one. Even if it is a mechanical one, the response is generally straightforward—for instance, if a part in a machine breaks, then you fix or replace the part. If a formula in an algorithm is not doing its intended function, then you reprogram the algorithm. However, you cannot so easily replace or fix a person and you cannot reprogram a person or a society or an economy.

As discussed earlier, complexity arises when there is a set of agents that interact and can adapt. The business world, and the global economy, are exactly that: a group of agents (customers, competitors, employees, vendors, investors, regulators, and so on) who constantly interact (sales transactions, advertising, social media, news feeds) and adapt (change strategies, develop new products, discover new wants and needs, change interests, change their opinions). Business and the economy are complex, and thus, by extension, risk management is complex.

The possibility of bad or good things happening is not something that is governed by a complicated formula. If it were, I would hope that business leaders, economists, politicians, and central banks would have figured it out and programmed the economy to work in a more efficient, predictable, and benevolent manner. They have not, simply because the world of business and the economy are complex entities. That is the central reason why economics is called the dismal science; it is not a complicated science at all!

Once you begin to think about it, and once one understands how complexity arises and its implications, it appears obvious that risk management is for the most part a profession of managing complexity. Despite the obvious, however, risk management is dominated by complicated thinking. There appear to be three main reasons for this: (1) an ignorance of complexity due to the relative newness as well as the novelty of the discipline, (2) a momentum built up from the engineering roots of risk management, and (3) an unwillingness to change a way of thinking based on arrogance and a false sense of confidence. Adding to this is the current age of the well-educated and highly degreed and certified expert—what I call the “white coat effect”: if someone is thought to be a scientist of any sort, then it is almost automatically assumed that they will be an expert with readily available answers and solutions.

Outside of the social and the biological sciences, there is an overall ignorance about complexity. It is rarely discussed in business schools, nor is it discussed in most advanced courses in risk management, which are based on mathematical (that is, complicated-based) principles of data analysis. In my discussions with experienced risk managers about complexity, they immediately get the concept and understand its appeal and suitability as a working model for risk management. However, I have had many well-educated, but less experienced, risk managers argue ferociously against a complex view and instead put forth their complicated ideas. Complexity is seen by them as a threat to the acceptance of risk management as a discipline worthy of respect.

This brings up the second issue: the momentum built up from the engineering roots of risk management. There are a lot of risk managers, and risk management organizations, who have a strong vested interest in keeping the status quo based on complicated thinking. It is extremely difficult to test someone’s ability to deal with complexity, but rather trivial to test someone’s knowledge of complicated things. Thus, schooling is based on complicated issues, certifications are based on complicated issues, and regulators institute processes based on complicated thinking.

With complexity management, you manage a situation, and you need to realize that it is neither possible nor reasonable to believe that you can solve a complex issue. This thinking, of course, goes directly against an engineer’s view of the world, in which every issue is a problem waiting to be solved if only enough brainpower is applied. The “manage, not solve” paradigm of complexity management also requires a huge amount of humility. For any highly trained individual, such humility may be difficult, if not impossible, to live with.

This brings us to the third issue behind the dominance of complicated thinking even though complexity is the more appropriate mindset. Everyone has an innate need to believe that they are needed in their organization, and there is no reason to think that risk managers are any different. With a complexity mindset, the answer to any complex question is generally of the form “maybe,” which is not an answer that many managers have the self-esteem to feel comfortable giving. Risk managers want to believe that they are “Masters of the Universe,” to borrow a phrase from author Tom Wolfe.3 They want to believe that through their skill, understanding, and intelligence they are capable of controlling outcomes. The reality is quite different.

How to Manage Different Types of Problems?

The different types of problems—simple, complicated, or complex—obviously require different types of management techniques and tactics. In fact, attempting to manage one type of system, say a complex issue, with simple or complicated tactics is likely to cause far more harm than good. At a minimum, using the wrong management tactic is likely to be inefficient and ineffective.

The simplest type of system to manage is, not surprisingly, a simple system. Because a simple process is best managed with a recipe or a guidebook, the obvious thing to do is to use the recipe or guidebook. To take a very common simple task, consider packing for a trip. I travel frequently on business, and to expedite the packing process, I have made up a business trip packing checklist. While it may seem like a very unnecessary thing to do, it is amazing how having a checklist both expedites the task of packing and reduces the risk of winding up in a city far from home and realizing you forgot the cufflinks you need for an early morning breakfast meeting. In fact, I once had a meeting with someone who had the hotel staple their cuffs together, and another who used paperclips as a substitute for their forgotten cufflinks.

A checklist is a very simple, but also very effective, risk management tool. My simple business packing checklist not only reduces the risk of forgetting something, but also reduces the amount of thought and stress that needlessly goes into packing. There are none of those pauses where you ask, “Have I packed everything?” as you stand there looking around your clothes closet or your office while scratching your head.
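The logic of a checklist as a risk tool is trivially mechanizable, which is precisely what makes the underlying task simple. The snippet below is a minimal sketch (the item names are hypothetical examples, not an actual packing list): it compares what was packed against the checklist and reports anything missing.

```python
def missing_items(packed, checklist):
    """Return the checklist items not found among the packed items."""
    return [item for item in checklist if item not in packed]

# Hypothetical business-trip checklist, for illustration only
business_trip = ["passport", "laptop", "charger", "dress shirts", "cufflinks"]

forgotten = missing_items(
    packed={"passport", "laptop", "charger", "dress shirts"},
    checklist=business_trip,
)
# The gap is caught at home, not at the breakfast meeting
```

That a few lines of code fully capture the task is itself the diagnostic: simple (and complicated) processes can be digitized; complex ones cannot.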

In a very interesting book, physician and author Atul Gawande outlines the advantages of using checklists and their usefulness in medical contexts.4 He gives clear evidence of how the use of simple checklists in the preparation for a surgical procedure dramatically increases the probability of success. It may seem so trivial that it is almost unnecessary, but the evidence is clear. Of course, medical procedures have a much greater level of risk and a much greater level of significance than my packing for a business trip.

Checklists are ideal for the risk management of simple systems and even complicated systems. However, caution should be used, as checklists tend to be very limiting when dealing with complex systems. A checklist is only appropriate if the task at hand is truly simple and routine. If there is the possibility for complexity to emerge, or for a nonroutine component to become a factor, then relying on a checklist may lead to blindness to those nonroutine factors. For instance, checklists can create a false sense of confidence, or lull one into a routine. If something nonroutine happens, it can easily be missed.

When dealing with a complicated system, the correct response is, of course, to engage an expert in the particular process or to follow the associated rules and laws. As they are completely defined and governed by laws or rules that produce completely reproducible results, complicated systems might actually be the easiest to risk manage—assuming, of course, that one knows the underlying laws that govern the system. Computers and robots are the ideal managers for complicated tasks, as they always produce the same results or outputs given the same inputs, and they respond more quickly and are less prone to error than humans. To reiterate, however, few systems or problems faced by organizations are completely complicated.

There are three main steps for managing complexity. They are: (1) recognize whether the issue at hand is simple, complicated, or complex, (2) think “manage, not solve,” and (3) engage a “try, learn, adapt” approach.5

The first step in managing complexity is to recognize that the system is complex. This, by itself, will help to prevent automatically reverting to complicated thinking mode. The mere recognition of this fact will help, in that a more flexible approach than usual will be utilized.

The second step in dealing with complexity is to think in terms of “manage, not solve.” The phenomenon of emergence, which is a fundamental outcome of complexity, means that one has to be humble in one’s expectations. A search for a solution will prove futile. Complex systems are too unpredictable and involve too many moving parts for one to realistically believe in a set solution. Unlike a complicated system, a complex issue cannot be decomposed into its constituent parts with each of the parts managed in isolation.

“Manage, not solve” is very difficult conceptually for the risk management profession to follow. It goes against the engineering background of the field, it goes against the quantitative analysis bias of the profession, and it requires an almost inhuman amount of humility to admit that there is no human-based analytical solution available. It is also taxing and difficult to do. When you solve a problem, you are done with it. When you have to manage an issue, however, you can never let your guard down; you can never stop thinking and adapting to the evolving situation. While complicated issues can require a large amount of brainpower and energy to solve, they are in the long run far easier to deal with than complex issues, which are indefinitely ongoing.

“Manage, not solve” is also difficult to do, given the pressure that risk managers are under by regulators, boards, senior managers, and even the general public to provide answers and concrete solutions. The expectations of the various stakeholders of an organization are rarely realistic, given the presence of complexity. It creates a quandary for the risk manager that can only be partially addressed with education and communication of the issues surrounding complexity management. It requires a luxury of time, patience, and understanding that risk managers are rarely ever given.

The third component of managing complexity follows from the “manage, not solve” mindset: adopt a “try, learn, adapt” mode of operation. The fundamental characteristic of complexity is emergence, which means that the risk manager must also practice emergence. The way to do this, of course, is to try things, see how the system reacts, learn from the reaction, and then adapt accordingly. Emergence comes from continual adaptation of the parts of the system. Emergence, thus, necessitates that risk managers also continually adapt. It requires the risk manager to keep not only a keen eye, but also an open eye (and an open mind), on how the complex risk issue is evolving. “Try, learn, adapt” is active, not passive, risk management. It involves the risk function becoming a living, breathing, in-the-moment function, just like the risk itself. It is the antithesis of the analyze, plan, implement, and let-the-plan-take-care-of-things approach that is so often utilized de facto.
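The "try, learn, adapt" loop can be sketched schematically. The code below is a deliberately stylized illustration (the tactic names, the toy system, and its parameters are all invented assumptions): the loop never terminates in a "solved" state; it simply tries a tactic, observes the system's reaction, and keeps or switches the tactic accordingly.

```python
import random

def try_learn_adapt(tactics, observe, rounds=20, seed=None):
    """Schematic 'try, learn, adapt' loop. There is no terminal 'solved'
    state: each round tries a tactic, observes the system's reaction,
    and keeps or switches the tactic accordingly."""
    rng = random.Random(seed)
    tactic = rng.choice(tactics)
    history = []
    for _ in range(rounds):
        outcome = observe(tactic, rng)      # try: act, watch the reaction
        history.append((tactic, outcome))
        if outcome < 0:                     # learn: the system pushed back
            # adapt: abandon the tactic that just failed
            tactic = rng.choice([t for t in tactics if t != tactic])
    return history

# Toy system whose response to any tactic is noisy, so no tactic stays
# "right" for long. The tactic names are hypothetical, for illustration.
def shifting_system(tactic, rng):
    base = {"hedge": 0.2, "limit": 0.0, "monitor": -0.2}[tactic]
    return rng.gauss(base, 1.0)

log = try_learn_adapt(["hedge", "limit", "monitor"], shifting_system, seed=7)
```

Note what the loop does not do: it never computes a final answer and stops. That is the "manage, not solve" posture expressed in code.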

Recall that complexity arises from a group of agents who interact and can adapt and change. By its very nature, then, complexity management requires a dynamic, rather than a static, approach. With the “try, learn, adapt” approach, it is not so much the knowing of tactics that is important, but the creativity of the tactics. You need to try things. You need the flexibility to actually make mistakes. You need the humility to realize that you cannot know the answer a priori, and furthermore, you may not even know the answer ex post. You need an open mind to learn intuitively, rather than through textbook learning, where each new idea becomes codified into a new process. After all, you must remember that complex systems are not reproducible like complicated systems. Thus, what worked one time is not likely to work the next time.

Risk management in the face of complexity necessitates that the tools of risk management be flexible and adaptable as well. Quarterly risk metrics or annual reports will not produce the timeliness needed. However, an organization that is steeped in the awareness of complexity, and that has the willingness and the flexibility to adopt complexity risk management, will itself be an emergent “machine” that will likely adapt naturally, and in the necessary timely manner, to a variety of emerging risk management issues.

VUCA

It seems natural to conclude this chapter with a brief introduction to the phrase VUCA. VUCA, or volatility, uncertainty, complexity, and ambiguity, is a natural way to extend the concept of complexity. As a concept, VUCA has been embraced in a variety of contexts, but perhaps nowhere more concretely and successfully than in the U.S. military.6

If anything, risk is all about volatility, uncertainty, complexity, and ambiguity. The problem, however, is that conventional risk management systems, and the training of risk managers, have been almost exclusively about the first two components, with an almost deliberate ignorance of the last two, complexity and ambiguity.

VUCA, however, is a paradigm that fits quite well with the techniques of systems thinking. The volatility and uncertainty parts are essentially the basis of simple and complicated thinking. Complexity, however, has been relatively ignored for the reasons previously discussed. It is time for the risk management profession to adopt VUCA, and in particular its complexity concepts, and accept the humility that doing so involves.

Concluding Thoughts

There is a final, and perhaps most important, aspect to managing complexity: embrace complexity rather than fear it. Complexity is a fact in business, and it is a fact in risk management. However, complexity does not just pick on one company or one organization—although at times it may seem like that. Instead, complexity is like playing tennis in a strong wind. No one likes to play tennis in a strong wind, as it upsets a player’s timing, strokes, and game plan or strategy. However, the wind affects both players. It is the player who adjusts best who wins, while the player who moans the most about the conditions, and about how they cannot use their normal game and strokes, almost always loses. The risk manager who acknowledges complexity, works with complexity, and even figures out a way to exploit complexity is the risk manager who will prove their worth to their organization.

Philosopher Eric Hoffer once stated that, “in times of change, learners inherit the earth, while the learned find themselves beautifully equipped to deal with a world that no longer exists.” In risk management, our problems are becoming more complex, while the simple and complicated issues are becoming less important (or being managed by a “learned” computer or robot). It is time for risk managers to become learners of complexity.

 

1 Robert May, Simon Levin, and George Sugihara, “Ecology for Bankers,” Nature, Vol. 451, February 21, 2008, 893–895.

2 Ironically, the principles behind the Black–Scholes–Merton model for dynamic hedging were used by the hedge fund Long-Term Capital Management, perhaps better known as LTCM. LTCM, which counted among its partners the Nobel Prize-winning economists Myron Scholes and Robert Merton, suffered losses that led to its collapse due to its strict adherence to the use of the formula. It is a striking example that risk management is not complicated.

3 Tom Wolfe, “The Bonfire of the Vanities,” 1987, Farrar, Straus and Giroux. The novel concerns a successful bond trader who believes that, through his intelligence, he alone is responsible for his impressive success; that he is a “Master of the Universe.” Through a series of subsequent events, he learns to his chagrin that he in fact has little control over his life. It is a fictional example of how complexity lays waste to the best laid plans and intentions.

4 Atul Gawande, “The Checklist Manifesto: How to Get Things Right,” 2009, Henry Holt and Company.

5 This three-step process for managing complexity is adapted from the book “It’s Not Complicated: The Art and Science of Complexity in Business,” by R. Nason. Forthcoming 2017, University of Toronto Press.

6 A more extensive introduction to VUCA can be found at R. Nason and E. Mare, “The Need for VUCA Management in Finance Education,” Technical Report UPWT 2015/24, Department of Mathematics and Applied Mathematics, University of Pretoria, 2015.
