2 How Do We Form Ideas and Arrive at a Position?

Before we explore the process and skills involved in constructive debate, let’s spend some time thinking about how we develop and communicate our ideas. Although most of us are trained to value objectivity, much of our communication is actually based on our own subjective interpretation of events. We react to these interpretations, which we will call assumptions, by making value judgments; these judgments are often followed by an emotional response. All this gets communicated to others. Linguistics, the scientific study of language, has developed some useful concepts that can help us understand the structure of communication.

Worldview

Each of us has a unique way of looking at the world. This can be compared to a particular set of lenses or a map of the world to which we continuously refer. We develop our worldview through family, culture, language, education, profession, industry, and the many other influences to which we are exposed, especially in our formative years. A Swede sees snow differently from a Floridian. A geologist looks at rocks differently from a rock climber. A finance person looks at customers differently from the way a salesperson does.

Deep Structure: Generalizations, Deletions, and Distortions

As we observe the “objective” world, we view it through our own lenses or filters. Our everyday environment is like water to a fish— it’s just there; we don’t take note of it. Most of the time, we’re not particularly conscious of what we consider normal activities, since we already have a place for them on our mental map; they fall into familiar categories. We have a tendency, as linguists have shown, to generalize from what we know to what we don’t know—and either to distort or to delete (edit out) anything that doesn’t make sense, given that view. All snow may look alike to Floridians; their experience does not provide a “map” for differentiation, so differences in the type of snow are ignored. Swedes or Aleuts, on the other hand, have the worldview, including the language, to distinguish among many different kinds of snow. Deleting or distorting that information would cause them real inconvenience. In a famous “selective attention” experiment at Harvard University a number of years ago,3 researchers Christopher Chabris and Daniel Simons demonstrated that when subjects were asked to count basketball passes shown on a brief video, half of them didn’t notice the person in the gorilla suit walking past the players. When asked afterward if they would have noticed such a thing, most of them were sure that they would, but they were surprised to discover that they had ignored it—in fact, had deleted it.

Generalization, deletion, and distortion are examples of what linguists call the “deep structure” of communication. It consists of the basic mental maps or models each of us has of reality. We develop these models over the course of our lives; they are almost always below our awareness, but they influence all our thinking and communication. We do not act directly on the world, in other words; we act on our model of the world. These models are both focusing and limiting. Generalization means that we assume that everything that is like our model in some characteristics is like it in all characteristics. Deletion means that we leave out of our perception things that don’t fit the model. Distortion means that we actually see or hear something different from “reality” in order to make it fit our model. In fact, the nature of objective reality has come under question in recent years from both ends of the political spectrum. We pick and choose (or create) the “facts” that best fit our beliefs or needs.

Facts, Assumptions, and Values

To communicate effectively, it’s important to be able to distinguish among facts, assumptions, and values. As we reflect on how we or others reach a conclusion, it’s useful to note the differences among these three types of statements or self-talk:

• Facts are data that can be objectively observed; for example, “The marketing manager has been meeting with the finance manager for over two hours.”

• Assumptions are the meanings we assign to the observed facts; for example, “There must be a problem in funding our new project.”

• Values are positive or negative beliefs or judgments in response to our assumptions; for example, “The marketing manager should have worked that issue out before announcing the new project.” Values are often expressed through opinions that are based on our assumptions or preferences, rather than on facts.

Managing Your Assumptions

We can never completely manage our assumptions; they are a part of us. It is said that “the eye cannot see itself”; so it is with assumptions. When we don’t distinguish among facts, assumptions, and the value judgments we make based on those assumptions, we operate as if our assumptions were the truth—that is, objective reality. This limits our ability to communicate with others, both in expressing our own ideas and in receiving information and ideas from them. It also limits the quality and variety of our ideas. Critical thinking is enhanced when we recognize that we operate with a worldview that leads us to make certain assumptions. When communicating about something significant, we can make our assumptions explicit to ourselves and others; this allows us to test them. When we are developing important ideas, we’re often communicating with people who hold at least somewhat different worldviews from our own. As a part of this process, it’s important to find common ground as well as to explore differences; this can only be accomplished if we’re willing to have an open discussion about the assumptions that each of us holds about the matter at hand.

How We Arrive at a Position

In most informal debates, participants enter with a position on the issue under consideration. They have arrived at their positions either through a process of reasoning or through adopting the position that best meets their underlying needs. It’s useful to deconstruct how these positions develop. The organizational psychologist Chris Argyris4 of Harvard University described the process of reasoning as occurring on a “ladder of inference”:

Observable data: At the bottom of the ladder of inference are all the data related to our topic of interest.

Selected data: We filter the data through our culture, values, needs, experience, language, or belief system, and then we select specific data to notice.

Interpretation: We assign meaning to our observations, placing them in a context that is familiar to us.

Assumptions: We make certain assumptions based on our interpretation of the data.

Conclusions: We draw conclusions from the assumptions. These conclusions may, over time, become fixed beliefs that then act as additional filters. Finally, we take actions based on the conclusions.

According to Argyris’s research, most of the time, we move up that ladder too quickly to take account of the steps along the way, thus causing us to confuse facts, values, and assumptions. We quickly integrate new information with our existing assumptions and may use the result to further justify our previous decisions or actions. The decisions we then make or the actions we then choose may bear only a slight relationship to all the relevant data available. Different people may select a different set of data, assign different meanings to it, and arrive at quite different conclusions.

In a constructive debate, we attempt to make this process visible both to ourselves and to others, thereby creating an opportunity to exchange information and ideas based on the same data, as well as to develop alternative ways of framing the information and to open up a variety of possible conclusions.

The same process occurs when a group or team is engaged in a debate or discussion. The team often allows the content of the discussion to stay at the top of the ladder, rather than “drilling down” to uncover the source of a suggestion or conclusion. The constructive debate skills we will be describing are based on the idea (really, an assumption!) that by making the thinking process more transparent and testable, the ideas that emerge will be more robust.

Avoiding Unconscious Bias and Other Thinking Errors

In recent years, we have become more aware of the role that unconscious bias plays in decision-making by both groups and individuals. We can think of these biases as errors in thinking. These errors usually involve confusion or lack of distinction among facts, assumptions, values, beliefs, or preferences. Certain thinking errors occur frequently in both business and personal decision-making. Unfortunately, they can be extremely costly to businesses and careers. Some aspects of organizational culture can even support these errors in thinking. Cultural values and norms can be very positive, but if leaders are not careful in interpreting them, they may work against strategic and critical thinking. For example:

• A bias toward alignment and teamwork can lead to “groupthink” and to avoidance of necessary and healthy conflict of ideas and principles.

• A focus on business results, especially short-term results, can lead to practices that optimize near-term gains but lead to unanticipated long-term problems or losses.

• An emphasis on strong leadership or a culture that is overly focused on specialist knowledge (while devaluing the broader knowledge of leaders who are generalists) can lead to a narrow focus on one person’s ideas, goals, and points of view.

Individuals may make or accept these errors in their business thinking for reasons that include personal advantage or gain, a desire to be accepted or respected, a wish to avoid conflict, a preference for answers that align with one’s own values or vested interests, or simply a preference for simple solutions that don’t require much effort. Here are a few of the most common errors:

Confirmation bias: a tendency to seek information that supports our expectations and to believe the information that supports our biases.

• Example: “The fact that our customers didn’t complain about the last batch proves that we have solved our quality issues.”

Popularity bias: a tendency to believe that something is true because so many (or so many of the “right” people) believe it to be so. This can be part of a need for belonging to a particular social group or “tribe.”

• Example: “Everyone I know thinks we should not do this—who am I to disagree?”

Hasty generalizations: a tendency to base conclusions on a very small or unrepresentative sample.

• Example: “We tried that once, and it didn’t go over well—it just would never work here.”

Wishful thinking: a tendency to believe that something is true because we want it to be so or because it would serve our vested interests.

• Example: “We have the best salesforce in the country, so it won’t be a big problem if we are late to market.”

Rationalization: a tendency to start with the conclusion and seek evidence to support it (similar to confirmation bias; when past investment is what drives the conclusion, it is sometimes called the “sunk costs” fallacy).

• Example: “We have already made a big commitment to this approach; it would be too expensive to start over again. Besides, we’d look wishy-washy.”

Adversarial bias/ad hominem arguments: a tendency to dismiss or devalue the ideas of those we don’t like or respect or whom we see as adversaries, regardless of their ideas’ actual merits. This error attacks the person instead of focusing on the idea.

• Example: “Why would I believe anything Janet says? She always exaggerates.”

Circular reasoning (sometimes called begging the question): a tendency to assume the truth of something that has not been proved and then to use that “truth” as an argument to support one’s point.

• Example: “We are taking this action because it’s the right thing to do.”

The way to prevent these and other fallacies from becoming key elements in important business decisions is to structure meetings and decision processes so that both leaders and team members are encouraged to question assumptions, seek a variety of opinions, and support their proposals with clear logic and appropriate data. Most organizations have a version of the old Army saying: “We don’t have time to do it right, but we have time to do it over.” In these days of closer scrutiny of business decisions, we may not have the luxury of doing it over. Strategic decisions ought to be made in a thoughtful, unbiased, and ethical way.

To put convincing arguments forward, we need to examine our thinking objectively and critically, as uncomfortable as that may often be. Likewise, in building and maintaining an environment that supports and encourages thoughtful and constructive debate, leaders must encourage team members to dispute points based on both the quality of the data and the merit of the thinking behind the argument.
