Chapter 2. The Structure of Multimodal Experiences

“Our personal worlds are constructions built by our brains using the raw materials of the senses—raw materials that are greatly modified during the construction process.”

—FAITH HICKMAN BRYNIE IN BRAIN SENSE (2009)

WHILE THE TITLE USER experience designer implies that designers create experiences for people, it might be more accurate to say that people construct experiences for themselves. As the saying goes, “Beauty is in the eye of the beholder.” And not just beauty, but functionality, clarity, and reliability. To be very literal, they aren’t just in our eyes: they are also in our skin, distributed across the surfaces of our bones, within our spines, and, in large part, up in our heads.

Designers, magicians, and filmmakers tap into many aspects of this construction process to great effect. Our eyes can differentiate hundreds of frames per second, but we can extrapolate continuous movement at much lower speeds, which allows us to enjoy stop-motion animation and animated GIFs. Many graphical user interfaces, or GUIs, use the metaphor of the desktop as a way to organize information (see Figure 2-1). This relies on a process known as assimilation: our ability to reuse existing mental models for new types of information. We have already learned how a physical desk works, and that existing mental model helps us understand how to use a computer’s operating system more easily. Another perceptual phenomenon, known as amodal completion, allows us to see objects as wholes even when sections of them aren’t directly visible. It is used to create visual effects in screen-based navigation, such as sliding panes and parallax. It’s also how the sawing-a-person-in-half trick works.

Figure 2-1. The Mac operating systems use visual metaphors

When we experience the physical world, we mostly need to figure out what is going on and what to do next. We are highly flexible and adaptive creatures, so it comes as no surprise that the way we figure things out is flexible and adaptive, too. We pull together our many different perceptual, cognitive, and physical abilities to come up with the appropriate response from one moment to the next, and our minds and bodies develop shortcuts along the way to make it easier. Our product experiences are no different: we figure out how to use devices, and develop usage patterns and shortcuts, in the same way.

The Human Slice of Reality: Umwelt and Sensibility

The way we experience the world is largely based on anticipating what we need, whether that anticipation is hardcoded into our body’s sensory mechanisms or softcoded by expectations built up through learning and doing. We all experience the world in different ways, and this diversity is even greater across species. Several types of snakes can sense heat with pits around their lips, which they use to zero in on prey while hunting at night. Dolphins devote roughly twice as much processing power to hearing as humans do and use it to locate objects underwater, something most land animals can’t do at all. A dog’s sense of smell is 100,000 times more sensitive than our own, and dogs use it to read emotions, something we can no longer really do. Like many devices, eels detect a range of the electromagnetic spectrum that most animals can’t, and they use this ability to sense the proximity of other creatures. Each of these sensory abilities gives the creatures that have it access to a unique slice of physical information that shapes the way they interact with their environments.

The idea of a biological foundation for how we perceive the world, as individuals and as species, is called umwelt, a theory articulated by Jakob von Uexküll and expanded by Thomas A. Sebeok and others. The word umwelt literally means “environment,” and Uexküll’s basic idea is that a being lives in a sensory world that reflects the things that can help it live and flourish.

Described by the psychologist James J. Gibson as affordances, the ability to identify usefulness in our environments was applied to design by Donald Norman, usability expert and author of The Design of Everyday Things (Basic Books). We perceive the different possibilities of interaction with objects and environments, but perceiving those interactions is also deeply tied to how we understand them. Most people rely on their sense of vision to navigate their surroundings. We can’t use hearing for navigation the way a dolphin does, or use smell to recognize each other in social interactions the way a dog does. Our umwelts define the range of interactions that are sensible to us, and which senses we use for which purposes. This is true not just for humans, but for all living things (see Figure 2-2).

Figure 2-2. A slice of the umwelts of a human, dolphin, dog, eel, and iPhone; all have particular senses that inform their “worldview”

Assembling Multimodal Experiences: Schemas and Models

The mental model, also introduced to design by Donald Norman, should be familiar to product designers. It is the internal model that people build of an object to understand what it is and how it works. Many human capabilities are developed through patterns that emerge over repeated experiences. In psychology, the concept of schemas is used to describe patterns in thought or behavior. Models are used to describe internal representations of external objects and events, built through repeated experiences. There is some overlap between these two concepts and how they are applied. There are many types of schemas and models, and they are used across all aspects of human behavior. We have a body schema, a map of ourselves in space that allows us to walk around without bumping into things and is integral to hand–eye coordination. Modalities and multimodalities are considered types of schemas, which are patterns in how we use our perceptual, cognitive, and physical abilities together. So while we couldn’t use a keyboard without a mental model of the layout and behavior of the alphanumeric keys, we couldn’t type without our body schema either. We would not be able to use a word processing application if we were not able to integrate language, typing, and reading into a single activity. That is multimodality.

We use schemas to organize and interpret our existing knowledge and behaviors, and we create new ones to handle new information and experiences effectively. They give rise to our expectations and skills, and they shape our ability to focus within an experience. Schemas are understood to be living structures: they evolve and expand with us over time. Like all working models, some become more permanent, validated over repeated experiences, while others continue to evolve to serve experiences that are less common, new, or complex. Schemas and models emerge to make human experience manageable (see Figure 2-3), and design patterns mirror this aspect of human behavior, emerging to make interfaces manageable. At their best, our senses work invisibly: we synthesize and systematize sensory data, and we automate skills and habits so that we can perform them without thinking. Good design, like our own senses, does the same.

Figure 2-3. Schemas affect how we perceive, recognize, and remember related things; design patterns follow mental models to make interaction consistent and manageable. If all power switches worked completely differently from each other, we would have to relearn them every time we used one.
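
To make the parallel concrete, here is a minimal sketch in TypeScript (the PowerSwitch interface and the device classes are invented for illustration, not drawn from any real API): a single shared contract plays the role of a mental model, so any device that honors it can be operated the same way.

// A shared contract plays the role of a mental model: anything that
// implements PowerSwitch can be operated without relearning it.
interface PowerSwitch {
  isOn(): boolean;
  toggle(): void;
}

// Two very different devices expose the same, predictable behavior.
class DeskLamp implements PowerSwitch {
  private on = false;
  isOn(): boolean { return this.on; }
  toggle(): void { this.on = !this.on; }
}

class CoffeeMaker implements PowerSwitch {
  private brewing = false;
  isOn(): boolean { return this.brewing; }
  toggle(): void { this.brewing = !this.brewing; }
}

// A "user" of these devices needs only one schema for every switch.
function flip(device: PowerSwitch): void {
  device.toggle();
  console.log(device.isOn() ? "on" : "off");
}

[new DeskLamp(), new CoffeeMaker()].forEach(flip);

The design choice mirrors the figure: the consistency lives in the shared contract, not in any one device, which is exactly what lets a schema transfer from one power switch to the next.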

The Building Blocks of Multimodal Experience

Sensing, understanding, deciding, and acting can be thought of as the building blocks of all our multimodal abilities. They are organized with schemas into cohesive bodies of knowledge and behaviors. In a very simplified example, combine language comprehension with vision, and you have the ability to read. Combine language with hearing, and you have listening. Combine language with hand movements, and you have writing or typing. Combine language with vocal cord and mouth movements, and you have speech. The same blocks support accessibility: combine language with touch in a different way, and you have Braille reading and typing; combine language with vision in a different way, and you have sign language. With time and practice, we develop patterns in how we use these different building blocks together and can use them more and more effortlessly (see Figure 2-4). These patterns are called modalities, and when we use multiple sets of senses together, they are called multimodalities.

Figure 2-4. Human communication spans multiple multimodalities: there are many different ways to communicate the same idea
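
As a rough sketch of this combinatorial idea (the type names and ability labels below are invented for illustration, not a formal model), you could represent each modality as a named combination drawn from the same small set of abilities:

// The building blocks recombine into named patterns (modalities).
type Ability =
  | "vision"
  | "hearing"
  | "touch"
  | "handMovement"
  | "vocalMovement"
  | "language";

interface Modality {
  name: string;
  abilities: Ability[];
}

const modalities: Modality[] = [
  { name: "reading",   abilities: ["language", "vision"] },
  { name: "listening", abilities: ["language", "hearing"] },
  { name: "typing",    abilities: ["language", "handMovement"] },
  { name: "speech",    abilities: ["language", "vocalMovement"] },
  { name: "braille",   abilities: ["language", "touch"] },
];

// Reuse is the point: every modality above recombines "language"
// with a different sense or movement.
const shared = modalities.every((m) => m.abilities.includes("language"));
console.log(shared); // true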

Each of the building blocks of our abilities contributes in some way to how we develop our multimodalities, and they all strongly influence the design of products. Our senses delimit the kinds of physical information we can experience (see Figure 2-5). This is where sensibility guidelines like legibility and audibility come into play: our eyes can only distinguish shapes down to a certain size and up to a certain distance, and our ears can only hear within a certain range of volumes and pitches. Our cognitive abilities are gaining more attention in product design as well. Cognitive walkthroughs, a research practice now in widespread use, examine the roles of memory, language comprehension, and the methods people use to weigh different factors in decision making. Decision making is a special area of interest: there are many different models of it, because there are many different kinds of decisions we make, from accepting a friend request to getting around a puddle. Our physical abilities, of course, have always informed design. All the important controls on a car dashboard must be within arm’s reach of the driver and usable with only one hand, and computer desks are now often adjustable in height so that our elbows and wrists rest at comfortable angles, preventing repetitive stress injuries.

Figure 2-5. Sense, understand, decide, and act can be considered the building blocks of experience
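
Sensibility guidelines like these often reduce to simple arithmetic. The sketch below checks legibility by computing the visual angle a piece of text subtends at the eye; the 0.3-degree threshold is an assumed rule of thumb for comfortable reading, not a universal constant.

// Visual angle (in degrees) subtended by an object of a given height
// viewed from a given distance; units only need to match each other.
function visualAngleDegrees(heightMm: number, distanceMm: number): number {
  return (2 * Math.atan(heightMm / (2 * distanceMm))) * (180 / Math.PI);
}

// Assumed threshold: ~0.3 degrees is a common rule of thumb for
// comfortably legible text.
const LEGIBLE_DEGREES = 0.3;

function isLegible(textHeightMm: number, viewingDistanceMm: number): boolean {
  return visualAngleDegrees(textHeightMm, viewingDistanceMm) >= LEGIBLE_DEGREES;
}

// 3 mm tall text at arm's length (~500 mm) subtends about 0.34 degrees.
console.log(isLegible(3, 500));  // true
console.log(isLegible(3, 2000)); // false: the same text across the room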

Summary

Our minds and senses continually work together so that we can coherently experience the world. Psychology studies the ways we perceive what is happening, understand how it relates to us, decide which of the many options afforded by our context are right for us, and act on that decision. All of this is important information for designers. Sensing, understanding, deciding, and acting can be considered the building blocks of multimodal experiences. The chapters that follow go deeper into each of them and explore how to make them work together.
