Numbers
2D devices
3D interaction techniques with, 20
2D surface-based interaction techniques
dragging, 280
3D bubble cursors, 270
3D display devices. See visual displays
3D graphics, 17
3D interaction. See interaction techniques
3D manipulation. See selection and manipulation
3D mice
overview of, 221
3D modeling tools, 22
3D printers, 235
3D sound sampling and synthesis, 154–155
3D spatial input devices
3D mice
overview of, 221
sensing technologies
bioelectric sensing, 211
hybrid sensing, 212
overview of, 200
radar sensing, 210
tracking technologies
head and hand tracking, 213
3D surface-based interaction techniques
pinching, 282
3D UIs. See also design
advantages and disadvantages of, 4–6
application areas for, 8–9, 23–25
user experience issues, 498–500
inventing
simulation of reality, 437–438
popular media background, 19
quantifying benefits of, 508–509
reciprocal impacts, 26
technological background, 17–19
A
absolute amplification, 304
absolute and relative mapping (ARM), 279–280, 304–305
acceleration selection, 323
accuracy
of interaction fidelity components, 479
of movement, 322
speed-accuracy trade-off, 39
ACM Symposium on Spatial User Interaction, 14
actions
definition of, 91
selection and control of, 39–40
active omnidirectional treadmills, 336
active scaling, 358
active sensors, 189
active stereo glasses, 131–132
active travel techniques, 323
activities, 90
activity design, 116
activity theory
3D UIs and, 92
definition of, 90
Activity Theory in HCI (Kaptelinin and Nardi), 92
adaptation
aerial perspective, 43
affinity diagrams, 114
affordances, 89–90, 99–100, 264
age, design considerations for, 435–436
aggregation of techniques, 301
in-air haptics, 175
always-on AR (augmented reality), 508
ambient effects, 158
ambisonics, 155
amplification, absolute, 304
anaglyphic stereo, 132
analysis
analytic evaluations, 120
requirements analysis
contextual inquiry, 113
requirements extraction, 115
Analyze stage (UX engineering), 109–110
animated prototypes, 119
annotation of auditory displays, 158
anthropometric symmetry, 479
applications, future developments in, 509–511
application-specific tasks, 20, 257, 259
AR (augmented reality)
always-on AR, 508
AR systems, 19
definition of, 8
evaluation of, 506
mobile AR case study
selection and manipulation, 313–314
arbitrary surface displays, 148–150
pros and cons of, 174
visual depth cues supported, 174
architecture
3D UI applications in, 24
as inspiration, 439
arcs, 296
Argonne Remote Manipulator, 161
ARM (absolute and relative mapping), 279–280, 304–305
art, 3D UI applications in, 24
artifact models, 114
ARToolKit, 207
asymmetric bimanual techniques, 434–435
atmospheric attenuation, 43
auditory cues
HRTFs (head-related transfer functions), 48
reverberation, 48
sound intensity, 49
spatial percepts and, 49
vestibular cues, 49
auditory displays
3D sound sampling and synthesis, 154–155
ambient effects, 158
annotation and help, 158
auralization, 155
headphones, 156
localization, 157
overview of, 153
pros and cons of, 175
sensory substitution, 158
sonification, 157
augmented reality. See AR (augmented reality)
auralization, 155
automation
automated scaling, 359
automated velocity, 356
autostereoscopic displays
lenticular, 150
parallax barrier, 150
pros and cons of, 174
visual depth cues supported, 174
B
ballistic movement, 39
bare-hand input, 502
BCIs (brain-computer interfaces), 227–228
behavioral processing, 112
behaviors, 480
between-subjects design, 459
bicycles, 343
bimanual techniques
asymmetric techniques, 433, 434–435
bimanual action, 71
symmetric techniques, 298–299, 433, 435
biomechanical symmetry, 479
Binocular Omni-Orientation Monitor, 143–144
binocular rivalry, 45
bioelectric sensing, 211
body-referenced haptic displays, 162, 175
body-referenced menus, 393–394
bottlenecks, attention and, 37
brain-computer interfaces (BCIs), 227–228
Brave NUI World (Wigdor and Wixon), 280
breadth of prototypes, 118
bubble cursors, 270
Bug, 223
buttons, 240
C
camera-in-hand technique, 350
cameras
depth cameras, 205
multi-camera techniques, 360–361
canonical manipulation tasks, 257–259
case studies
mobile AR case study
selection and manipulation, 313–314
VR gaming case study
design approaches, 452
overview of, 28
selection and manipulation, 312–313
CAT (Control Action Table), 232
category knowledge, 58
CAVE (Cave Automatic Virtual Environment), 134
CavePainting Table, 231
cerebral palsy, input devices for children with, 233
ChairIO interface, 342
chemical sensing system, 53–54
choosing
input devices
empirical evaluations, 243–244
important considerations for, 238–239
input device taxonomies, 240–243
classification of techniques
manipulation techniques, 262–265
system control, 384
clay, modeling, 236
clearance design guideline, 100
closed-loop motor control, 39–40
cockpits, 343
codes of processing (information-processing pipeline), 63
cognition
cognitive issues, evaluation of
psycho-physiological methods, 66
cognitive affordances, 89
cognitive walkthroughs, 458
situation awareness
spatial knowledge types, 61
combination haptic devices, 168–169, 175
commands
definition of, 380
gestural commands
overview of, 398
practical application, 402–404
voice commands
design issues, 397
practical application, 397–399
speech recognition systems, 396–397
communication, human-computer. See HCI (human-computer interaction)
comparative evaluation, 459–460
compasses, 366
complementary input
component evaluation
component definition, 478
display fidelity components, 480–481
evaluation approaches
goals of, 483
interaction fidelity components, 479–480
results, applying, 478, 486–494
scenario fidelity components, 480
composite tasks, interaction techniques for, 20
Computer-driven Upper Body Environment (CUBE), 135
Computer-Supported Cooperative Work (CSCW), 93
conceptual models
designer’s model, 88
conditioning, 38
conductive cloth, 235
confirmation of irreversible actions, 102
constraints, 264
construction, 3D UI applications in, 24
context of evaluation, 468–470
contextual inquiry, 113
continuous velocity control, 355–356
contour interruption, 42
control, design principle of, 103–104
Control Action Table (CAT), 232
control dimensions, 260
control symmetry, 480
control-display mappings, 252
control-space techniques
levels-of-precision (LOP) cursors, 289–290
virtual interaction surfaces, 288–289
Cooper-Harper Rating Scale, 65
cooperative manipulation, 433
corrective movement, 39
critiquing, 116
cross-task techniques, 345
CSCW (Computer-Supported Cooperative Work), 93
CUBE (Computer-driven Upper Body Environment), 135
Cubic Mouse, 223
cues
auditory
HRTFs (head-related transfer functions), 48
reverberation, 48
sound intensity, 49
spatial percepts and, 49
vestibular cues, 49
gustatory, 54
haptic, 52
kinesthetic and proprioceptive cues, 52
pain, 52
tactile cues, 51
thermal cues, 52
olfactory, 54
visual
monocular, static visual cues, 42–44
motion parallax, 45
cursors. See also selection and manipulation
3D bubble cursor, 270
levels-of-precision (LOP) cursors, 289–290
curvature of path, 322
curved surround-screen displays, 136–137
cutaneous sensations, 50
cybersickness, 57, 425, 462, 507–508
cycles, 343
D
DataGlove (VPL), 12
Davies, Char, 446
decoupling, 409
degrees of freedom (DOF), 6, 188, 322
deictic gestures, 400
delimiters, 402
demos, 460
dependent variables, 459
depth cameras, 205
depth cues, 129–130, 172–174, 498–500
depth of prototypes, 118
depth ray, 279
design
3D UI applications in, 23
basic principles of, 16
case studies
VR gaming case study, 452
design representations, 117–118
design scenarios, 117
future developments in, 500–505
HCI (human-computer interaction)
evaluation-oriented design, 104–106
execution-oriented design, 99–102
outcome-oriented design, 102–104
human-based, 423
feedback displacement, 425
feedback substitution, 428–429
passive haptic feedback, 429–431
unconventional user interfaces, 423–424
impact of evaluation on, 456
output device selection, 171–177
perspectives, 116
recommended reading, 454
selection and manipulation, 309–311
simulation of reality, 437–438
system control
multimodal techniques, 411
tools, 408
voice commands, 397
tools, 116
UX (user experience) engineering, 16
prototype evaluation, 120
requirements analysis, 112–115
system concept, 112
visualization, 16
design representations, 117–118
design scenarios, 117
Design stage (UX engineering), 109–110
designer’s model, 88
desktop 6-DOF input devices, 198–200
development
definition of, 91
tools, 22
device-referenced menus, 393–394
diagrams, affinity, 114
difficulty, index of, 39
direct manipulation, 99
direct velocity input, 356
direction, 323
directional compliance, 304, 426
pointing direction, 273
selection calculation, 273
discoverability, 402
discrete velocity changes, 355
displacement, feedback, 425
display devices. See visual displays
display fidelity components, 480–481
distance, travel and, 322
divided attention, 37
division of labor, 91
DIY (do it yourself) devices
connecting to computers, 236–237
strategies for building, 234–236
DOF (degrees of freedom), 6, 188, 322
dolls
definition of, 293
Double Bubble technique, 309
dragging, 280
dual-point world manipulation, 353
dual-target techniques, 346–347
dynamic alignment tools, 432
dynamic depth cues, 45
E
ease of use, 111
ecological perspectives, 116
education, 3D UI applications in, 24
EEG (electroencephalography), 66, 227
effectiveness, 111
efficiency, 111
egocentric information, 61
egocentric reference frames, 62
egomotion, 61
electroencephalography (EEG), 66, 227
electromyography (EMG), 74, 211, 218
Electronic Visualization Laboratory, 134
electrovibration tactile displays, 164–165
Embodied Interaction
definition of, 92
tangible computing, 93
embodied phenomena, 92
EMG (electromyography), 74, 211, 218
emotional impact, 112
emotional perspectives, 116
empirical evaluations, 120, 243–244
endogenous demands, 63
endurance time, 72
entertainment, 3D UI applications in, 24
environment characteristics, 474–475
environment legibility, 364–365
environment models, 114
environment-centered wayfinding cues, 364–367
Ergodesk, 443
ergonomics
design principles for, 100–101
evaluation of
performance measures, 74
psycho-physiological methods, 74
typical 3D UI issues, 73
feet and legs, 71
haptic displays, 160
musculoskeletal system, 67
sensory-motor distribution, 69
visual displays, 129
ERPs (event-related potentials), 37
errors
error recovery, 106
human error, 64
rate of, 111
reduction and correction, 409
Evaluate stage (UX engineering), 109–110
evaluation. See usability evaluation
evaluation-oriented design
error recovery, 106
event-related potentials (ERPs), 37
execution-oriented design
direct manipulation, 99
exocentric information, 61
exogenous demands, 63
Expand technique, 308
experimentation, evaluation and, 488–489
exploration, 320
external speakers, 156–157, 175
externalization, 91
extraction, requirements, 115
EyeRing, 225
F
factorial design, 459
family of rotations, 296
FastLSM algorithm, 269
fatigue, 73
feedback
design principles for, 104–105
displacement, 425
instrumental, 425
operational, 425
passive haptic feedback, 429–431
reactive, 425
feet, physical ergonomics of, 71
fidelity
definition of, 478
display fidelity components, 480–481
interaction fidelity components, 479–480
prototypes, 119
scenario fidelity components, 480
field of regard (FOR), 127–128, 480–481
field of view (FOV), 127–128, 362, 480–481
FIFA (Framework for Interaction Fidelity Analysis), 479
finger-based grasping techniques
rigid-body fingers, 268
soft-body fingers, 269
FingerSleeve, 224
fishing reel, 275
fish-tank virtual reality, 133
Fitts’s Law, 82
fixed-object manipulation, 351–352
flashlight technique, 276
flavor, 54
flexibility, 409
flexible pointer technique, 300–301
flicker, 129
Fly Mouse, 222
fMRI (functional magnetic resonance imaging), 227
fNIRS (functional near-infrared spectroscopy), 227
focused attention, 37
Foley, Jim, 12
force-feedback devices, 161–162
force-reflecting joysticks, 161
form factors
formative evaluations, 120, 458
FOV (field of view), 127–128, 362, 480–481
frames
Framework for Interaction Fidelity Analysis (FIFA), 479
Fraunhofer IMK Cubic Mouse, 223
freedom, degrees of, 6, 188, 322
front projection, 139
full gait techniques
overview of, 326
fully programmed prototypes, 119
functional affordances, 89
functional magnetic resonance imaging (fMRI), 227
functional near-infrared spectroscopy (fNIRS), 227
functional requirements, 115
future of 3D UIs
development and evaluation issues, 505–507
quantifying benefits of, 508–509
user experience issues, 498–500
G
gait
full gait techniques
overview of, 326
gait negation techniques
active omnidirectional treadmills, 336
low-friction surfaces, 336–337
overview of, 334
passive omnidirectional treadmills, 334–335
partial gait techniques, 330–333
galvanic skin response, 66
gaming
3D UI applications in, 24
VR gaming case study
design approaches, 452
overview of, 28
selection and manipulation, 312–313
gaze-directed steering, 339–340
general design rules
geometrical coherence, 431
gestural commands
overview of, 398
practical application, 402–404
gestures
definition of, 398
deictic, 400
gestural commands
overview of, 398
practical application, 402–404
gesture-based interaction, 502
mimic, 400
speech-connected hand gestures, 400
surface-based, 400
sweeping, 400
symbolic, 400
Gibson, William, 439
global positioning systems (GPS), 212
goal-oriented design
structure, 97
visibility, 98
Goals, Operators, Methods, and Selection (GOMS), 82–83
GOMS (Goals, Operators, Methods, and Selection), 82–83
Gorilla arm syndrome, 72
GPS (global positioning systems), 212
graphical menus
practical application, 396
grasping techniques
enhancements for
3D bubble cursor, 270
Hook, 272
intent-driven selection, 272
PRISM (Precise and Rapid Interaction through Scaled Manipulation), 271
finger-based
rigid-body fingers, 268
soft-body fingers, 269
hand-based
overview of, 264
grip design, 70
ground-referenced haptic displays, 161–162, 175
guided exploration, 416
guidelines-based expert evaluation, 458
gustatory cues, 54
H
habituation, 38
hand tracking, 213
hand-based grasping techniques
hand-directed steering, 340–341
handheld widgets, 391
hands-free 3D UIs, 504
haptic displays. See also haptic system
in 3D UIs, 169
body-referenced, 162
ergonomics, 160
future developments, 499
passive haptics, 169
perceptual dimensions, 159
resolution, 160
visual depth cues supported, 175–176
haptic system, 52. See also haptic displays
kinesthetic and proprioceptive cues, 52
pain, 52
tactile cues, 51
thermal cues, 52
HCI (human-computer interaction), 123–124
activity theory
3D UIs and, 92
definition of, 90
basic principles of, 16
conceptual models
designer’s model, 88
design principles
evaluation-oriented design, 104–106
execution-oriented design, 99–102
outcome-oriented design, 102–104
development as discipline, 80
Embodied Interaction
definition of, 92
tangible computing, 93
human processor models
GOMS (Goals, Operators, Methods, and Selection), 82–83
KLM (Keystroke-Level Model), 82
Touch-Level Model (TLM), 83
overview of, 78
recommended reading, 121
user action models
overview of, 84
User Action Framework (UAF), 86–87
UX (user experience) engineering
prototype evaluation, 120
requirements analysis, 112–115
system concept, 112
head-mounted displays (HMDs). See HWD (head-worn displays)
head-mounted projective displays (HMPDs), 144
head-referenced menus, 393–394
head-related transfer functions (HRTFs), 48, 154–155
head-worn displays. See HWD (head-worn displays)
hearing-impaired users, 106–107
heart rate assessment, 66
height relative to horizon, 42
Heilig, Morton, 166
help, 158
heritage and tourism applications, 23
heuristic evaluation, 458
hierarchical task analysis, 114
high-fidelity prototypes, 119
HMDs (head-mounted displays). See HWD (head-worn displays)
HMPDs (head-mounted projective displays), 144
Hook enhancement, 272
horizon, height relative to, 42
horizontal prototypes, 118
HRTFs (head-related transfer functions), 48, 154–155
human error, 64
human factors. See also human-based design
cognition
cognitive issues, evaluation of, 63–66
information processing
selection and control of action, 39–40
perception
chemical sensing system, 53–54
overview of, 41
perception issues, evaluation of, 56–58
sensory substitution, 55
physical ergonomics
ergonomics issues, evaluation of, 73–74
feet and legs, 71
musculoskeletal system, 67
sensory-motor distribution, 69
recommended reading, 76
Human Interface Technology Lab, 144
human joystick metaphor, 332–333
human processor models
GOMS (Goals, Operators, Methods, and Selection), 82–83
KLM (Keystroke-Level Model), 82
Touch-Level Model (TLM), 82–83
human-based design, 423
feedback displacement, 425
feedback substitution, 428–429
passive haptic feedback, 429–431
two-handed control
asymmetric techniques, 434–435
overview of, 432
symmetric techniques, 435
unconventional user interfaces, 423–424
human-computer interaction. See HCI (human-computer interaction)
HWD (head-worn displays)
Binocular Omni-Orientation Monitor, 143–144
HMPDs (head-mounted projective displays), 144
need for 3D user interfaces with, 4
optical see-through displays, 145–146
projector-based displays, 146
pros and cons of, 173
video see-through displays, 145
visual depth cues supported, 173
VRDs (virtual retinal displays), 144
hybrid haptic displays, 168–169, 175
hybrid interaction techniques, 21
aggregation of techniques, 301
HOMER technique, 302
technique integration, 301
hybrid sensing, 212
HYDROSYS system case study
selection and manipulation, 313–314
I
ID (index of difficulty), 39
ideation, 116
IEEE Symposium on 3D User Interfaces, 14
IHL (inside-the-head localization), 156
IID (interaural intensity difference), 47
IllumiRoom, 149
image-plane pointing, 275
immersion, 480
Implement stage (UX engineering), 109–110
IMUs (inertial measurement units), 203, 215
independent variables, 459
index of difficulty (ID), 39
index of performance (IP), 40
indirect techniques
indirect control-space techniques
levels-of-precision (LOP) cursors, 289–290
virtual interaction surfaces, 288–289
indirect proxy techniques
world-in-miniature (WIM), 291–292
indirect widget techniques
inertial measurement units (IMUs), 203, 215
inFORM, 164
information design, 116
information processing
information-processing pipeline, 63
selection and control of action, 39–40
informative feedback, 105
input, conditions of, 323
input devices
2D mice and trackballs, 194–196
3D mice
overview of, 221
case studies
choosing
empirical evaluations, 243–244
important considerations for, 238–239
input device taxonomies, 240–243
control dimensions, 260
definition of, 6
device placement and form factor, 261–262
DIY (do it yourself) devices
connecting to computers, 236–237
strategies for building, 234–236
evaluation of, 22
force versus position control, 260–261
integrated control, 260
pen- and touch-based tablets, 196–197
sensing technologies
bioelectric sensing, 211
hybrid sensing, 212
overview of, 200
radar sensing, 210
special-purpose devices, 228–234
tracking technologies
head and hand tracking, 213
input veracity, 479
inside-out approach, 202
inside-the-head localization (IHL), 156
instrumental feedback, 425
instruments, 91
integrated control, 260
intelligent constraints, 432
intensity of sound, 49
intent-driven selection, 272
interaction, human-computer. See HCI (human-computer interaction)
interaction design, 116
interaction fidelity components, 479–480
interaction perspectives, 116
interaction style, 381
interaction techniques, 4
control-display mappings, 252
definition of, 7
evaluation of, 22
hybrid, 21
multimodal interaction, 21
selection and manipulation
application-specific tasks, 259
canonical manipulation tasks, 257–259
multiple-object selection, 305–307
nonisomorphic 3D rotation, 303–305
progressive refinement, 307–309
recommended reading, 315
spatial rigid object manipulation, 257
surface-based interaction techniques, 280–286
system control
classification of techniques, 384
multimodal techniques, 409–411
transfer functions, 252
two-handed, 21
interactivity
interactive 3D graphics, 17
prototypes, 119
interaural intensity difference (IID), 47
interaural time difference (ITD), 47
interface requirements, 115
interface widgets, 4
interference filtering, 132
internalization, 91
interposition, 42
Intersection-based Spatial Interaction for Two Hands (iSith), 298–299
interviews, 460
inventing 3D UIs
simulation of reality, 437–438
IP (index of performance), 40
irreversible actions, confirming, 102
iSith (Intersection-based Spatial Interaction for Two Hands), 298–299
isometric muscle contraction, 67
isomorphic manipulation techniques, 262–263
isotonic muscle contraction, 67
ITD (interaural time difference), 47
iterative evaluation, 456
J-K
joysticks
human joystick metaphor, 332–333
isotonic, 197
Kay, Alan, 511
Keystroke-Level Model (KLM), 82
kinematic symmetry, 479
kinesthetic cues, 52
kinetic symmetry, 479
KLM (Keystroke-Level Model), 82
L
labor, division of, 91
landmark knowledge, 61
LCD panels, 135
lean-directed steering, 341–342
learnability, 111
legibility techniques, 364–365
Lego Interface Toolkit, 236
legs, ergonomics of, 71
lemniscal pathway, 50
lenticular displays, 150
levels-of-precision (LOP) cursors, 289–290
lifecycle of UX (user experience) engineering, 109–110
light transfer in visual displays, 129
linear perspective, 43
local prototypes, 118
localization, 157
locators, 240
locomotion techniques, 501–502
long-duration VR (virtual reality) sessions, 508
longitudinal evaluation, 506–507
LOP (levels-of-precision) cursors, 289–290
low-fidelity prototypes, 119
low-friction surfaces, 336–337
M
maneuvering, 321
manipulation. See selection and manipulation
manipulation-based travel metaphors
viewpoint manipulation techniques
fixed-object manipulation, 351–352
world manipulation techniques, 352–353
mapping
ARM (absolute and relative mapping), 279–280, 304–305
future developments in, 500
marker-based sensing systems, 207
markerless sensing systems, 207–208
marking points along paths, 348–349
massively multiplayer online role-playing games (MMORPGs), 93
mechanoreceptors, 50
mediation, 91
medicine, 3D UI applications in, 25
medium-fidelity prototypes, 119
memory
working, 59
MEMS (microelectronic mechanical systems), 204
mental load, 63
mental resources, control of, 409–410
menus
graphical menus
practical application, 396
metaphors
bimanual
asymmetric bimanual techniques, 299–301
symmetric bimanual techniques, 298–299
grasping
finger-based grasping techniques, 267–270
hand-based grasping techniques, 264–267
overview of, 264
aggregation of techniques, 301
HOMER technique, 302
technique integration, 301
indirect
indirect control-space techniques, 287–291
indirect proxy techniques, 291–294
indirect widget techniques, 294–297
manipulation-based travel
viewpoint manipulation techniques, 349–352
world manipulation techniques, 352–353
pointing
overview of, 273
pointing direction, 273
selection calculation, 273
selection-based travel
route-planning techniques, 347–349
target-based techniques, 345–347
steering
physical steering props, 343–344
spatial steering techniques, 339–342
surface
surface-based 2D interaction techniques, 280–281
surface-based 3D interaction techniques, 282–286
walking
gait negation techniques, 334–338
partial gait techniques, 330–333
metrics
subjective response metrics, 461–462
system performance metrics, 461
task performance metrics, 461
for testbed evaluation, 475
mice
3D mice
overview of, 221
microelectronic mechanical systems (MEMS), 204
mimic gestures, 400
miniature keyboards, 191
mixed reality (MR), 8
MMORPGs (massively multiplayer online role-playing games), 93
mobile AR (augmented reality) case study
selection and manipulation, 313–314
mobility-impaired users, 106–107
mockups, 117
modalities (information-processing pipeline), 63
modeling clay, 236
models
3D UIs as, 510
activity theory
3D UIs and, 92
definition of, 90
artifact, 114
conceptual
designer’s model, 88
Embodied Interaction
definition of, 92
tangible computing, 93
environment, 114
human processor
GOMS (Goals, Operators, Methods, and Selection), 82–83
KLM (Keystroke-Level Model), 82
Touch-Level Model (TLM), 82–83
performance, 456
physical, 114
user, 113
user action
overview of, 84
User Action Framework (UAF), 86–87
monocular, static visual cues, 42–44
motion parallax, 45
movements, 39
movies as design inspiration, 439
MR (mixed reality), 8
MSVEs (multiscale virtual environments), 359
multi-camera techniques, 360–361
multimodal interaction, 21
multimodal techniques
design principles, 411
overview of, 409
practical application, 411
multiple dimensions in feedback, 424–425
multiple-object selection, 258, 305–307
multiscale virtual environments (MSVEs), 359
multi-sensory display systems, 499–500
multisensory output, 363
multisensory processing, 55–56
musculoskeletal system, 67
N
naïve search, 320
NASA Ames Research Center, 218
NASA TLX (task load index), 65
natural user interfaces, 398
Navidget, 360
navigation
case studies
recommended reading, 377
travel
active techniques, 323
classification of techniques, 323–325
combining with wayfinding, 367
definition of, 318
exploration, 320
gait negation techniques, 334–338
maneuvering, 321
multi-camera techniques, 360–361
nonphysical input, 361
partial gait techniques, 330–333
passive techniques, 323
physical, 323
physical steering props, 343–344
route-planning techniques, 347–349
scaling-and-traveling techniques, 358–359
semiautomated, 357
spatial steering techniques, 339–342
target-based techniques, 345–347
task characteristics, 322
velocity specification, 355–356
viewpoint manipulation techniques, 349–352
viewpoint orientation, 353–354
virtual, 323
world manipulation techniques, 352–353
wayfinding
combining with travel, 367
environment-centered cues, 364–367
overview of, 361
Neuromancer (Gibson), 439
nociceptors, 50
nonconventional system control, 381
nonisomorphic manipulation
nonisomorphic 3D rotation, 303–305, 354
nonphysical travel input, 361
novice users, designing for, 436
nulling compliance, 426
nulling correspondence, 304
O
object properties, 480
object-orientedness, 91
object-referenced menus, 393–394
objects
definition of, 90
reference objects, 367
occlusion, 42
olfactory cues, 54
omnidirectional treadmills
active, 336
open-ended interviews, 460
open-loop motor control, 39–40
operational feedback, 425
operations, 91
optical see-through displays, 145–146
orbital viewing, 354
orientation, viewpoint, 353–354
Osmose, 446
outcome-oriented design
outcomes, 91
output devices, 4
auditory displays
3D sound sampling and synthesis, 154–155
ambient effects, 158
annotation and help, 158
auralization, 155
headphones, 156
localization, 157
overview of, 153
pros and cons of, 175
sensory substitution, 158
sonification, 157
definition of, 7
haptic displays
in 3D UIs, 169
body-referenced, 162
ergonomics, 160
passive haptics, 169
perceptual dimensions, 159
resolution, 160
overview of, 126
visual displays
advances in, 17
arbitrary surface displays, 148–150
autostereoscopic displays, 150–153
depth cue effects, 129–130, 172–174
ergonomics, 129
FOR (field of regard), 127–128
HWD (head-worn displays), 141–148
light transfer, 129
refresh rate, 129
screen geometry, 128
single-screen displays, 131–134
spatial resolution, 128
surround-screen displays, 134–140
outside factors for testbed evaluation, 474–475
outside-in approach, 202
P
pain, 52
parallax barrier displays, 150
parameters of canonical tasks, 258
partial gait techniques, 330–333
participatory design, 117
passive haptics, 169, 176, 429–431
passive omnidirectional treadmills, 334
passive sensors, 189
passive stereo glasses, 132
passive travel techniques, 323
Pavlov, Ivan, 38
pedal-driven devices, 343
pen-and-tablet technique, 441–443
PenguFly technique, 341
perception
auditory system
HRTFs (head-related transfer functions), 48
reverberation, 48
sound intensity, 49
spatial percepts and, 49
vestibular cues, 49
chemical sensing system, 53–54
overview of, 41
perception issues, evaluation of
performance measures, 57
psycho-physiological methods, 58
subjective measures, 57
sensory substitution, 55
somatosensory, 52
kinesthetic and proprioceptive cues, 52
pain, 52
tactile cues, 51
thermal cues, 52
vision
monocular, static visual cues, 42–44
motion parallax, 45
perceptual dimensions, 159
perceptual user interfaces, 398
performance
index of, 40
measures
of perception issues, 57
of physical ergonomics issues, 74
models, 456
requirements, 115
speed of, 111
perspective
aerial, 43
linear, 43
perspectives (design), 116
PET (positron emission tomography), 227
Phidgets, 238
photorealism, 445
physical affordances, 89
physical environment issues, 463–464
physical ergonomics
ergonomics issues, evaluation of
performance measures, 74
psycho-physiological methods, 74
typical 3D UI issues, 73
feet and legs, 71
musculoskeletal system, 67
sensory-motor distribution, 69
physical mockups, 117
physical models, 114
physical steering props, 343–344
physical travel, 323
physically realistic constraints, 431
pick devices, 240
PICTIVE, 117
pinching, 282
PIP (projected intersection point), 298–299
placement
plasticity of brain, 55
Pointer Orientation-based Resize Technique (PORT), 306–307
pointing techniques
absolute and relative mapping (ARM), 279–280
depth ray, 279
overview of, 273
pointing direction, 273
selection calculation, 273
vector-based
fishing reel, 275
image-plane pointing, 275
ray-casting, 274
volume-based
flashlight, 276
sphere-casting, 278
popular media, influence on 3D UI, 19
PORT (Pointer Orientation-based Resize Technique), 306–307
positioning, 258
positron emission tomography (PET), 227
Precise and Rapid Interaction through Scaled Manipulation (PRISM), 271
precision, 479
precision-grip devices, 261–262
presence, sense of, 4–5, 363, 462, 465
primed search, 320
PRISM (Precise and Rapid Interaction through Scaled Manipulation), 271
procedural knowledge, 61, 85–86
progressive refinement
Double Bubble technique, 309
Expand technique, 308
overview of, 307
projected intersection point (PIP), 298–299
projector-based displays, 146
proprioceptive cues, 52
prototypes, 118–119. See also DIY (do it yourself) devices
benefits and drawbacks of, 118
breadth of, 118
depth of, 118
evaluating, 120
fidelity of, 119
horizontal, 118
interactivity of, 119
local, 118
T prototypes, 118
vertical, 118
proxy techniques
world-in-miniature (WIM), 291–292
psychiatry, 3D UI applications in, 25
psycho-physiological methods
for cognitive issue evaluation, 66
for perception issue evaluation, 58
of physical ergonomics issues, 74
push-to-talk schemes, 226
“put-that-there” technique, 410
Q-R
quantifying 3D UI benefits, 508–509
questionnaires, 460
radar sensing, 210
radio frequency identification (RFID), 407
rapid evaluations, 120
rate of errors, 111
ray-based modeling, 155
ray-casting technique, 105, 274
reach design guideline, 100–101
reactive feedback, 425
real world
realism, 478
recall, 108
reciprocal impacts, 26
recommended reading
design approaches, 454
HCI (human-computer interaction), 121
human factors, 76
navigation, 377
selection and manipulation, 315
recovery (error), 106
reference objects, 367
referents, 400
reflective processing, 112
regard, field of, 127–128, 480–481
relative size, as visual cue, 42
repeatability, 479
representations, 58
design representations, 117–118
representation-based target techniques, 345–346
representative subsets of manipulation tasks, 257
requirement statements, 115
requirements, 115
requirements analysis
contextual inquiry, 113
requirements extraction, 115
requirements extraction, 115
research questions, 459
resolution
haptic displays, 160
spatial, 128
response
selection and control of action, 39–40
stimulus-response compatibility, 39
Responsive Workbench, 139
retainability, 111
reverberation, 48
RFID (radio frequency identification), 407
rigid-body fingers, 268
rigorous evaluations, 120
Ring Mouse, 224
robotics, 3D UI applications in, 25
rotation techniques
Arcball, 296
family of rotations, 296
nonisomorphic 3D rotation, 354
nonisomorphic 3D rotation techniques, 303–305
route knowledge, 61
route-planning techniques, 347–349
S
SAGAT (Situation Awareness Global Assessment Technique), 65
Samsung Gear VR, 107
Santa Barbara Sense of Direction (SBSOD), 64–65
SARCOS Dextrous Arm Master, 161
satisfaction, 111
SBSOD (Santa Barbara Sense of Direction), 64–65
scaling-and-traveling techniques, 358–359
scenario fidelity components, 480
scenarios (design), 117
screen geometry, 128
scripted prototypes, 119
selection and manipulation, 258
application-specific tasks, 259
bimanual techniques
asymmetric bimanual techniques, 299–301
symmetric bimanual techniques, 298–299
canonical manipulation tasks, 257–259
case studies
graphical menus, 394
grasping techniques
overview of, 264
hybrid techniques
aggregation of techniques, 301
HOMER technique, 302
technique integration, 301
indirect techniques
indirect control-space techniques, 287–291
indirect proxy techniques, 291–294
indirect widget techniques, 294–297
input devices and, 259
control dimensions, 260
device placement and form factor, 261–262
force versus position control, 260–261
integrated control, 260
manipulation-based travel metaphors
viewpoint manipulation techniques, 349–352
world manipulation techniques, 352–353
multiple-object selection, 305–307
nonisomorphic 3D rotation techniques, 303–305
pointing techniques
overview of, 273
pointing direction, 273
selection calculation, 273
progressive refinement
Double Bubble technique, 309
Expand technique, 308
overview of, 307
recommended reading, 315
selection-based travel metaphors
route-planning techniques, 347–349
target-based techniques, 345–347
surface-based interaction techniques
target selection, 323
velocity/acceleration selection, 323
selection calculation, 273
selection volumes
defining, 306
selection-volume widget, 306–307
selection-based travel metaphors
route-planning techniques, 347–349
target-based techniques, 345–347
selection-volume widget, 306–307
selective attention, 37
semantics, 432
semiautomated travel, 357
sensing technologies. See also tracking technologies
bioelectric sensing, 211
hybrid sensing, 212
overview of, 200
radar sensing, 210
Sensorama, 166
sensors. See also tracking technologies
active, 189
passive, 189
radar, 210
sensory affordances, 89
sensory dimensions (feedback), 425
sensory-motor distribution, 69
sequential evaluation, 470–473
serial selection mode, 305
Seven Stages of Action, 477
shadows
and illusion of depth, 43
ShapeTape, 228
shutter glass synchronization, 138–139
sign language, 400
signs, 367
simulation of reality, 437–438
simulator sickness. See cybersickness
simulator systems
3D UI applications in, 24
overview of, 18
Simultaneous Localization and Mapping (SLAM), 208
single-object selection, 258
single-point world manipulation, 352
single-screen displays, 131–134
pros and cons of, 172
visual depth cues supported, 172
situation awareness
spatial knowledge types, 61
Situation Awareness Global Assessment Technique (SAGAT), 65
size
relative size, 42
SKETCH modeling system, 359, 443
sketching, 116
skills, decision-making and, 38–39
SLAM (Simultaneous Localization and Mapping), 208
smart 3D UIs, 503
SmartScene, 358
Snow Crash (Stephenson), 439
social 3D UIs, 510
social context, 113
soft keyboards, 193
soft-body fingers, 269
somatosensory system, 52
kinesthetic and proprioceptive cues, 52
pain, 52
tactile cues, 51
thermal cues, 52
sonification, 157
sound cues. See auditory cues
sound displays. See auditory displays
sound intensity, 49
spatial cognition. See cognition
spatial compliance, 426
spatial input devices
3D mice
overview of, 221
sensing technologies
bioelectric sensing, 211
hybrid sensing, 212
overview of, 200
radar sensing, 210
tracking technologies
head and hand tracking, 213
spatial knowledge types, 61
spatial percepts, 49
spatial rigid object manipulation, 257. See also selection and manipulation
spatial steering techniques, 339–342
speakers, external, 156–157, 175
special-purpose input devices, 228–234
spectral multiplexing, 132
speech recognition engines, 396
speech recognition systems, 396–397
speech-connected hand gestures, 400
speed of performance, 111
speed-accuracy trade-off, 39
sphere-casting, 278
Spindle + Wheel technique, 299–300
Spindle technique, 298
spinothalamic pathway, 50
stages of processing (information-processing pipeline), 63
stakeholders, 113
statements (requirement), 115
steering
gaze-directed steering, 339–340
hand-directed steering, 340–341
lean-directed steering, 341–342
physical steering props, 343–344
steering law, 40
torso-directed steering, 341
Stephenson, Neal, 439
stereo-based cameras, 205
stimulus-response compatibility, 39
storyboards, 117
strafe, 339
strategies, design. See design
strength design guideline, 101
strings, 240
strokes, 240
structure, design principles for, 97
structured interviews, 460
structured-light depth cameras, 205
subcutaneous sensations, 50
subjective measures
of perception issues, 57
of physical ergonomics issues, 73–74
subjective response metrics, 461–462
Subjective Workload Assessment Technique (SWAT), 65
subjects, 90
substitution
feedback substitution, 428–429
sensory substitution, 55
summative evaluations, 120, 459–460
surface friction tactile displays, 165
surface-based gestures, 400
surface-based interaction techniques
2D techniques
dragging, 280
3D techniques
pinching, 282
surround-screen displays
CAVE (Cave Automatic Virtual Environment), 134
curved surround-screen displays, 136–138
front projection, 139
pros and cons of, 138–139, 172
visual depth cues supported, 172
survey knowledge, 61
Sutherland, Ivan, 12
SWAT (Subjective Workload Assessment Technique), 65
sweeping gestures, 400
SWIFTER, 102
symbolic gestures, 400
symmetric bimanual techniques, 298–299, 435
synchronous tasks, 433
system characteristics, 474–475
system concept, 112
system control
case studies
classification of techniques, 384
gestural commands
overview of, 398
practical application, 402–404
practical application, 396
multimodal techniques
design principles, 411
overview of, 409
practical application, 411
tools
design issues, 408
practical application, 408–410
voice commands
design issues, 397
practical application, 397–399
speech recognition systems, 396–397
system performance metrics, 461
T
T prototypes, 118
tabletop displays
pros and cons of, 173
visual depth cues supported, 173
tactile displays, 163–165, 175
tangible computing, 93
tangible user interfaces (TUIs), 93, 405–408
target selection, 323
target-based travel techniques, 345–347
Task Analysis/Workload scale, 65
task load index (TLX), 65
tasks
decomposition
parameters, 258
performance metrics, 461
task models, 114
task space, 258
travel tasks
exploration, 320
maneuvering, 321
task characteristics, 322
TAWL (Task Analysis/Workload scale), 65
taxonomies
input device taxonomies, 240–243
testbed evaluation, 474
technique integration, 301
HOMER technique, 302
technological background, 17–19
telerobotics, 8
testbed evaluation
examples of, 476
goals of, 483
overview of, 473
performance metrics, 475
results, applying, 475–476, 486–494
taxonomy, 474
texture gradient, 43
thermal cues, 52
thermoreceptors, 50
Three-Up, Labels In Palm (TULIP) technique, 388
time-of-flight depth cameras, 205
TLM (Touch-Level Model), 82–83
TLX (task load index), 65
tools
3D UIs as, 510
design tools, 116
dynamic alignment tools, 432
design issues, 408
practical application, 408–410
torso-directed steering, 341
Touch-Level Model (TLM), 82–83
tourism applications, 23
tracking technologies. See also sensing technologies
head and hand tracking, 213
traditional input devices
2D mice and trackballs, 194–196
desktop 6-DOF input devices, 198–200
pen- and touch-based tablets, 196–197
trails, 367
training, 3D UI applications in, 24
transfer function symmetry, 480
transfer functions, 252
travel
active techniques, 323
classification of techniques, 323–325
combining with wayfinding, 367
definition of, 318
exploration, 320
full gait techniques
overview of, 326
gait negation techniques
active omnidirectional treadmills, 336
low-friction surfaces, 336–337
overview of, 334
passive omnidirectional treadmills, 334–335
maneuvering, 321
multi-camera techniques, 360–361
nonphysical input, 361
partial gait techniques
passive techniques, 323
physical, 323
physical steering props, 343–344
recommended reading, 377
route-planning techniques, 347–349
scaling-and-traveling techniques, 358–359
semiautomated, 357
spatial steering techniques, 339–342
target-based techniques, 345–347
task characteristics, 322
velocity specification, 355–356
viewpoint manipulation techniques
fixed-object manipulation, 351–352
viewpoint orientation, 353–354
virtual, 323
world manipulation techniques, 352–353
treadmills
active omnidirectional treadmills, 336
low-friction surfaces, 336–337
passive omnidirectional treadmills, 334–335
true 3D displays, 498
TUIs (tangible user interfaces), 93, 405–408
TULIP (Three-Up, Labels In Palm) technique, 388
two-handed control, 21
asymmetric techniques, 434–435
overview of, 432
symmetric techniques, 435
two-point threshold test, 159
U
UAF (User Action Framework), 86–87
UbiComp (ubiquitous computing), 8
UI (user interface), 6
ultrasound-based in-air haptics, 166
unconventional user interfaces, 423–424
UniCam, 359
Uniport, 343
universal tasks, interaction techniques for, 20
usability
evaluating. See usability evaluation
improving, 111
usability evaluation
case studies
characteristics of, 463
evaluation type issues, 466–467
physical environment issues, 463–464
of cognitive issues
psycho-physiological methods, 66
empirical evaluations, 243–244
evaluation approaches
definition of, 457
sequential evaluation, 470–473
cognitive walkthroughs, 458
formative evaluations, 458
heuristic evaluation, 458
interviews and demos, 460
questionnaires, 460
summative evaluations, 459–460
evaluation type issues, 466–467
evaluation-oriented design
error recovery, 106
formal experimentation in, 488–489
iterative, 456
of perception issues
performance measures, 57
psycho-physiological methods, 58
subjective measures, 57
of physical ergonomics issues
performance measures, 74
psycho-physiological methods, 74
typical 3D UI issues, 73
prototypes, 120
subjective response metrics, 461–462
system performance metrics, 461
task performance metrics, 461
terminology for, 457
usability properties of 3D rotation mappings, 304–305
User Action Framework (UAF), 86–87
user action models
overview of, 84
User Action Framework (UAF), 86–87
user experience. See UX (user experience) engineering
user experience engineering. See UX (user experience) engineering
user groups, designing for, 435–436
user intent, 503
user interface (UI), 6
user models, 113
user-centered wayfinding cues, 361–364
user-preference scales, 57
User-System Loop, 477
uTrack, 225
UX (user experience) engineering, 7
evaluation, 7
system concept, 112
V
valuators, 240
variables
dependent, 459
independent, 459
VE (virtual environment), 8
vector-based pointing techniques
fishing reel, 275
image-plane pointing, 275
ray-casting, 274
velocity specification, 355–356
velocity/acceleration selection, 323
vertical prototypes, 118
vestibular cues, 49
vibrotactile displays, 163
video see-through displays, 145
view, field of, 127–128, 362, 480–481
viewpoint manipulation techniques
fixed-object manipulation, 351–352
viewpoint orientation, 353–354
virtual body, 363
virtual environment (VE), 8
virtual interaction surfaces, 288–289
Virtual Notepad, 441
virtual reality. See VR (virtual reality)
virtual retinal displays (VRDs), 144
Virtual Trackball techniques, 295–296
virtual travel, 323
Virtual Tricorder metaphor, 445
visceral processing, 112
vision
overview of, 41
vision-based sensor systems, 210
vision-impaired users, 106–107
visual channels (information-processing pipeline), 63
visual cues
monocular, static visual cues, 42–44
motion parallax, 45
visual data analysis, 24
visual displays
advances in, 17
autostereoscopic
lenticular displays, 150
parallax barrier displays, 150
depth cue effects, 129–130, 172–174
ergonomics, 129
FOR (field of regard), 127–128
HWD (head-worn displays)
Binocular Omni-Orientation Monitor, 143–144
HMPDs (head-mounted projective displays), 144
optical see-through displays, 145–146
projector-based displays, 146
video see-through displays, 145
VRDs (virtual retinal displays), 144
light transfer, 129
refresh rate, 129
screen geometry, 128
spatial resolution, 128
surround-screen
CAVE (Cave Automatic Virtual Environment), 134
curved surround-screen displays, 136–137
front projection, 139
visual sphere techniques, 354
visualization, 16
voice commands
design issues, 397
practical application, 397–399
speech recognition systems, 396–397
volume-based pointing techniques
flashlight, 276
sphere-casting, 278
volume-based selection techniques, 305–306
volumes (selection)
defining, 306
selection-volume widget, 306–307
Voodoo Dolls, 292–294, 444–445
vortex-based in-air tactile displays, 166–167
VPL DataGlove, 12
VR (virtual reality), 8
definition of, 8
long-duration VR sessions, 508
need for 3D user interfaces with, 4
overview of, 18
VR gaming case study
design approaches, 452
overview of, 28
selection and manipulation, 312–313
VR sickness, 462
VRDs (virtual retinal displays), 144
VRML specification, 25
W-X-Y-Z
W3C (World Wide Web Consortium), 25
walking metaphors
full gait techniques
overview of, 326
gait negation techniques
active omnidirectional treadmills, 336
low-friction surfaces, 336–337
overview of, 334
passive omnidirectional treadmills, 334–335
partial gait techniques
wave-based modeling, 155
wave-field synthesis, 155
wayfinding
combining with travel, 367
environment-centered cues, 364–367
overview of, 361
recommended reading, 377
Wheel lifecycle (UX engineering), 109–110
Where the Action Is: The Foundations of Embodied Interaction (Dourish), 92
whole-body interaction, 401
widgets
handheld, 391
indirect widget techniques
Navidget, 360
overview of, 20
Phidgets, 238
WIM (world-in-miniature), 291–292, 350
WIMP (Windows, Icons, Menus, and Pointers), 194–195
within-subjects design, 459
Wizard of Oz prototypes, 119
work activity notes, 114
work roles, 113
workbench displays
pros and cons of, 173
visual depth cues supported, 173
working memory, 59
world manipulation techniques, 352–353
World Wide Web Consortium (W3C), 25
world-grounded haptic devices, 161–162
world-in-miniature (WIM), 291–292, 350
world-referenced menus, 393–394
X3D, 25