Chapter Two. Computing and Human-Computer Interaction
Computers were not intended to be vehicles for entertainment and content delivery. Computers were intended to compute. Highly specialized work required highly specialized machinery, and the computational ability of machines built from vacuum tubes and punched cards was not thought to be of much use to anyone outside a small circle of like-minded engineers. That circle, however, contained a number of great thinkers, scientists, and generals: the military can be credited with carrying many of the technical advances we take for granted today from idea to production.

Understanding the history of human-computer interaction

The specialized nature of individual computational projects defined early computers, such as the UNIVAC in 1951 (originally intended to assist the United States government in completing the census) or the ENIAC I in 1946 (supported by the military to assist in the development of artillery-firing tables). These computers were good at performing simple, discrete tasks, and they were custom solutions to custom problems. A subtle shift occurred in 1952, however, when IBM announced the development of the 701:
“Our progress in electronics convinced us one year ago that we had in our company the ability to create for the Defense Department, and the defense industries, a computer of advanced design which could be of major service to our national defense effort.… We began planning and building such a machine, which we believe will be the most advanced, most flexible high-speed computer in the world. It is built not for one special purpose but as a general purpose device, and two days after it was announced on a limited confidential basis we had orders for ten… The new calculator takes less than one-quarter the space of the previous machine. It is difficult to compare speeds, but we feel conservatively that the new calculator is 25 times faster than our old one and far more flexible. In addition, the new machine is a commercial machine which will be rented and serviced with our regular line of products.”12 [Emphasis added.]
12It is interesting to note that, even in 1952, the general notion of Moore's law is present (computational speed will increase exponentially over time) and the heralding of “faster and smaller” is being used to sell technology. < http://www-03.ibm.com/ibm/history/exhibits/701/701_announced.html>, courtesy of IBM Corporate Archives.
While still seemingly driven by a patriotic sense of duty, International Business Machines clearly had a more commercial motive, and this can be considered the launch of computing as a business tool—a tool intended to increase productivity across business tasks and, ultimately, to increase revenue. After the successful launch of the 701, IBM commenced the rapid development of additional high-end, room-sized, expensive business machines. The machines got faster and smaller, and were viewed in business circles as the tools of automation necessary to make the enterprise run more smoothly and leanly. This remained true until a significant event occurred in 1968. The event was relatively unknown outside of computing culture, but it has clearly shaped the heart of the information age.
In 1968, at the Civic Auditorium in San Francisco, a group of over a thousand hackers13 listened and watched as a handsome young man quietly sat beneath an enormous display. The man had a soft, hypnotizing voice, and for nearly 90 minutes he held the room of engineers captivated as he demonstrated one miraculous vision after another. At this conference, Doug Engelbart, a researcher at the Augmentation Research Center (ARC) at the Stanford Research Institute (SRI) in Menlo Park, California, presented a working system that highlighted—for the very first time—windowed displays, a graphical user interface, networking, hyperlinks, audio and video “conferencing,” dynamic file linking, shared-screen collaboration, and a mouse. “It is almost shocking to realize that in 1968 it was a novel experience to see someone use a computer to put words on a screen… Those who were in the audience at Civic Auditorium that afternoon remember how Doug's quiet voice managed to gently but irresistibly seize the attention of several thousand high-level hackers for nearly two hours, after which the audience did something rare in that particularly competitive and critical subculture—they gave Doug and his colleagues a standing ovation.”14
13Brian Harvey, a Professor of Computer Science at Berkeley, explains that a computer hacker “… is someone who lives and breathes computers, who knows all about computers, who can get a computer to do anything. Equally important, though, is the hacker's attitude. Computer programming must be a hobby, something done for fun, not out of a sense of duty or for the money.” < http://www.cs.berkeley.edu/~bh/hacker.html>
14Rheingold, Howard. Tools for Thought: The History and Future of Mind-Expanding Technology. MIT Press, 2000. p188. Also available online: < http://www.rheingold.com/texts/tft/9.html>
Engelbart publicly outlined a vision of computing as a truly human-centered tool, a tool that could be used to achieve great feats for the individual. While his work would not appear in a commercially available form until some years later, this little-known event in 1968 can truly be thought of as the beginning of the “information age.”
The impact of Doug Engelbart's vision of computing may not have been realized immediately, yet his vision spread quietly as a number of his friends and students found their way into the world of academic research. Xerox PARC was the next major contributor to the world of computing, and it included a number of Doug's disciples; PARC can be thought of as the first workspace that formally embraced Interaction Designers. Xerox Corporation's mission for PARC, when it was officially founded in July of 1970, was to create “the architecture of information.”15 By 1973, eleven years before the original Macintosh was released in 1984, the Xerox Alto was operational; it included a What-You-See-Is-What-You-Get (WYSIWYG) editor, a mouse, a graphical user interface (GUI), a bit-mapped display, menus, icons, windows, and Ethernet: the ability to communicate with a larger network.
But even PARC missed the beauty of its creation. The engineers at PARC failed to see that the computer could be used for something outside the worlds of efficiency and productivity. The idea of one man, one computer was novel, but it did not transcend the then-established notion of a computer as a business tool—a fairly benign object intended to make transactions faster. It took a particularly savvy individual to grasp the potential for a human use of technology: Steve Jobs.16 As Jobs toured PARC, he saw the future of computing. “And they showed me really three things. But I was so blinded by the first one I didn't even really see the other two. One of the things they showed me was object orienting programming… the other one they showed me was a networked computer system… they had over a hundred Alto computers all networked using email etc., etc. I didn't even see that. I was so blinded by the first thing they showed me which was the graphical user interface. I thought it was the best thing I'd ever seen in my life… within, you know, ten minutes it was obvious to me that all computers would work like this some day.”17
16Jobs did not, as is commonly believed, “steal” the idea of a graphical user interface from Xerox. In fact, Apple negotiated a stock-for-visit trade with Xerox, and implicit in the visit was the right to use a number of the ideas the visitors saw as they toured PARC. It has been argued that Jobs is not a savvy businessman and that his success at Apple has been a fluke; this deal seems to indicate the contrary, and it might be the business deal of the century.
17PBS: Triumph of the Nerds, Program Transcription. < http://www.pbs.org/nerds/part3.html> Text provided courtesy of Oregon Public Broadcasting.
Many digital designers consider Engelbart's work, and the extended development that followed at PARC, to be the birth of a new field of computing dedicated to the ambiguous “art” of crafting how people relate to machines. HCI, or Human-Computer Interaction, has become the name for this field, and it can be formally defined as the “discipline concerned with the design, evaluation and implementation of interactive computing systems for human use and with the study of major phenomena surrounding them.”18 After PARC was created, the momentum of human-centered computing built steadily until the Macintosh was released in 1984. The Macintosh marked a dramatic shift from the notion of computing as specialized (computational) work, to computing as part of all work, and finally to computing used in the home. This path extends the field of HCI from dealing primarily with the implementation of computing systems toward an understanding of how people “interface” with technology.
18Card, Stu, et al. Curricula for Human-Computer Interaction. ACM SIGCHI, 1992/1996. < http://sigchi.org/cdg/cdg2.html#2_1> Association for Computing Machinery, Inc. Reprinted by permission.
In the field of HCI, a particular style of interface design quickly arose as the norm. This interface system included windows, icons, menus, and pointing (and clicking), and became known as WIMP. The interface style was found in the original Alto, in Apple's first major graphical user interface-driven computer, the Lisa, and in the Apple Macintosh. WIMP has lived far longer than was ever expected, as it is the same paradigm found in modern-day Macintosh and Windows operating systems. Jef Raskin, one of the original designers credited with the Apple Macintosh operating system, was working diligently to develop an alternative to WIMP prior to his death in 2005. “The Mac is now a mess… One only cares about getting something done. Apple has forgotten this key concept.”19 While WIMP was once novel, Raskin appears to have become frustrated with the emphasis on aesthetics and graphics at the expense of usability.
19Walsh, Jason. “Talk time: Jef Raskin.” The Guardian. October 21, 2004. < http://technology.guardian.co.uk/online/story/0,3605,1331536,00.html>
With the development of WIMP came the general notion that computers could and should be used by the masses. The text-input command line was certainly enough to turn off non-technical individuals, and the direct manipulation of overlapping windows—along with clever marketing from Apple—made the machine more accessible to most families. Much has been written about the two decades following the release of the Apple Macintosh computer, usually with an emphasis on the increasing capabilities of computers and the exponential growth both Apple and Microsoft enjoyed. While the majority of these historical texts attempt to understand the changes that occurred in the field of computing, a broader look at the improvement of technology-driven products reveals an interesting growth of computer-like products that go by other names. Cellular phones, digital cameras, and other consumer electronics are, in fact, computers with different physical manifestations. Many practitioners in the field of HCI are beginning to consider the pragmatic implications technological advancement has for their profession. What if the “computer” as commonly understood takes another form—a form with different sizes and constraints, or a form without a screen?

Cyborgs and the ubiquity of technology

The work of Steve Mann, formerly of MIT, illustrates one view of the technology-driven “computerless” future: Mann has dedicated over twenty years to investigating the nature of the cyborg, the science fiction-driven vision of half man, half machine. Mann refers to himself as Cyberman and wears a heads-up display embedded in his sunglasses. He also carries a hip-pack-style computer, which enables him to record and recall video, imagery, and other data during a casual conversation.20 The following Mann has created is impressive, with a number of students at both MIT and the University of Toronto running around campus with dark glasses and wires streaming about their bodies. While the students are able to play a real-life version of The Matrix, these same young technologists have the genius necessary to go on to manage the companies that are increasingly present in our lives. Microsoft, a particularly technology-centered company, envisions a Connected Home—the lights turn on when you enter, and the thermostat adjusts to your particular preferences. This technology-centered view of the future, the kind seen in science fiction movies, is decidedly unfamiliar. Practitioners involved in HCI struggle to make these cyborg-inspired tools easier to use, and struggle equally hard to illustrate the implications of a “blue screen of death” on a thermostat. Underlying the development of these technologies is a seemingly rhetorical but critically important question: do people want these things in their house? Technology is the driving force behind these innovations, and humanity is left to cope as best it can when these technical “advancements” reach the marketplace.
20McMullan, Erin. “Cyberman (2001).” Idea Idee: Digitaleve Canada's Webzine. < http://wearcam.org/cyberman_antithetical_relationship_of_art_mathematics_physics_technology.htm>
Recently, a great deal of attention and effort has been placed on the creation of this type of smart device, or applied computing. Academics and industry practitioners alike are investigating ways to embed computing in various locations around the home, or even on the body. Many of these investigations are driven by engineering innovations, and while the results are technically quite impressive, few engineers or product managers seem to be asking the difficult question of “why?” Why produce a refrigerator that knows when it is out of milk? Why create lighting systems that turn themselves on or off when a person enters or leaves the room? Those engaged in HCI activities—Interaction Designers—exist to ask these difficult questions, and to create frameworks for compelling experiences rather than merely technical ones. Interaction Design has outgrown its computing roots, and it is now a field responsible for humanizing technology.
The history of Interaction Design, then, is marked by the constant and rapid growth of technology, followed by a struggle to make that technology behave. The past twenty years show HCI professionals engaging in Usability Engineering, and Human Factors Engineers working with Industrial Designers, to tame the complexity created by technological advancement. Deeply entrenched in companies and organizations are individuals who advocate for humanity rather than for technology. These individuals are slowly finding ways to move toward the creation of designs that are not simply usable, but useful and desirable as well.