User Interaction in 3D Space

One vision of virtual reality has us doing our shopping, filing our documents, and conducting many of our human interactions in interactive 3D spaces. To this end, a large number of devices exist on the market today, such as gloves, wands, and body trackers, as well as more specialized devices such as head and eye trackers. We leave aside haptic and force feedback devices to focus on sensors and objects that are passive in nature. Extending Java 3D to support haptics and motion platforms is considered in Chapter 13.

Picking

Picking, in particular object picking, refers to the user's selection of an object in the 3D scene. Picking allows the user to perform actions such as selecting 3D user interface elements in the environment or dragging and dropping objects to new locations (that is, rearranging the environment). The trick in picking is to translate a mouse click on the 2D screen (or input from some other 2D device) into a position in the 3D world. This is accomplished by casting a ray from the user's eye position through the mouse click position and determining which objects in the 3D world the ray intersects. We will cover picking in greater detail in Chapter 12.
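As a rough sketch of how this works in practice, the Java 3D utility package com.sun.j3d.utils.picking wraps the ray-casting step. The listing below is only an illustration (the class name SimplePickListener is made up, and an existing Canvas3D and scene BranchGroup are assumed); it converts each mouse click into a pick ray and reports the closest shape the ray intersects.

    import java.awt.event.MouseAdapter;
    import java.awt.event.MouseEvent;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Canvas3D;
    import javax.media.j3d.Node;
    import com.sun.j3d.utils.picking.PickCanvas;
    import com.sun.j3d.utils.picking.PickResult;
    import com.sun.j3d.utils.picking.PickTool;

    public class SimplePickListener extends MouseAdapter {
        private final PickCanvas pickCanvas;

        public SimplePickListener(Canvas3D canvas, BranchGroup scene) {
            // PickCanvas builds the eye-through-pixel pick ray for us.
            pickCanvas = new PickCanvas(canvas, scene);
            pickCanvas.setMode(PickTool.BOUNDS); // intersect bounding volumes (fast)
            pickCanvas.setTolerance(4.0f);       // allowable slop around the click, in pixels
        }

        public void mouseClicked(MouseEvent e) {
            pickCanvas.setShapeLocation(e);      // 2D click -> pick ray into the 3D world
            PickResult result = pickCanvas.pickClosest();
            if (result != null) {
                Node picked = result.getNode(PickResult.SHAPE3D);
                System.out.println("Picked: " + picked);
            }
        }
    }

Registering an instance with the canvas's addMouseListener() method is then enough to perform a pick on every click.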

Navigation

Obviously, for any virtual environment application, navigation is going to be key. The psychologist Edward Tolman and his students at Berkeley did much of the fundamental work on how rats build up cognitive maps of their environment through exploration. They showed convincingly that rats can use information about the environment to compute novel and optimal trajectories.

A fundamental question is whether we build robust cognitive maps of virtual environments, and the answer is pretty clear that in most cases we do not. Getting lost in virtual reality is commonplace, and this fact has serious implications for the use of virtual environments for things such as training and 3D experiential e-commerce.

In virtual spaces, navigation usually involves moving the viewing platform through the environment in a first-person fashion. We will explore different mechanisms for adding realism to VR-based navigation in Chapter 12.
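As a minimal sketch, assuming a SimpleUniverse and using the standard utility class com.sun.j3d.utils.behaviors.keyboard.KeyNavigatorBehavior (the wrapper class and method names below are illustrative), simple first-person navigation can be added by attaching the behavior to the view platform's TransformGroup.

    import javax.media.j3d.BoundingSphere;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.TransformGroup;
    import javax.vecmath.Point3d;
    import com.sun.j3d.utils.behaviors.keyboard.KeyNavigatorBehavior;
    import com.sun.j3d.utils.universe.SimpleUniverse;

    public class NavigationSetup {
        // Attaches arrow-key navigation to the view platform of an existing universe.
        public static void addKeyboardNavigation(SimpleUniverse universe) {
            TransformGroup viewTransform =
                    universe.getViewingPlatform().getViewPlatformTransform();

            // KeyNavigatorBehavior moves the target TransformGroup in response
            // to keyboard input, giving simple first-person motion.
            KeyNavigatorBehavior keyNav = new KeyNavigatorBehavior(viewTransform);
            keyNav.setSchedulingBounds(new BoundingSphere(new Point3d(), 1000.0));

            BranchGroup behaviors = new BranchGroup();
            behaviors.addChild(keyNav);
            universe.addBranchGraph(behaviors);
        }
    }

The behavior simply translates and rotates the view platform in response to the arrow keys; the mechanisms for adding realism discussed in Chapter 12 build on this kind of viewpoint control.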

Java 3D Sensors for External Devices

Java 3D provides the Sensor class, which, along with the InputDevice interface, can be used to support interaction with a variety of external devices. The major problem is that few vendors have written implementations of these interfaces. If the device to be used isn't one of the few that are supported, it is up to the programmer to develop a custom implementation.

Briefly, Java 3D communicates with a device driver through the InputDevice interface. For an input device to be used by Java 3D, its driver must implement the InputDevice interface, and the resulting object must be registered with the PhysicalEnvironment object.
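A minimal sketch of such a driver is shown below, assuming a hypothetical single-sensor tracker (the class name CustomTrackerDevice and the omitted hardware calls are placeholders); the methods are those required by the javax.media.j3d.InputDevice interface.

    import javax.media.j3d.InputDevice;
    import javax.media.j3d.Sensor;
    import javax.media.j3d.Transform3D;

    public class CustomTrackerDevice implements InputDevice {
        private final Sensor sensor = new Sensor(this);
        private final Transform3D read = new Transform3D();
        private int processingMode = InputDevice.DEMAND_DRIVEN;

        public boolean initialize() {
            // Open and configure the physical device here; return false on failure.
            return true;
        }

        public void pollAndProcessInput() {
            // Read the device's current position/orientation into 'read'
            // (hardware-specific code omitted), then publish it to the sensor.
            sensor.setNextSensorRead(System.currentTimeMillis(), read, new int[0]);
        }

        public void processStreamInput() { }   // not used in demand-driven mode

        public void setNominalPositionAndOrientation() {
            read.setIdentity();
        }

        public int getSensorCount() { return 1; }
        public Sensor getSensor(int index) { return sensor; }

        public void setProcessingMode(int mode) { processingMode = mode; }
        public int getProcessingMode() { return processingMode; }

        public void close() {
            // Release the physical device.
        }
    }

After initialize() succeeds, the device is registered with the PhysicalEnvironment, for example via universe.getViewer().getPhysicalEnvironment().addInputDevice(device) when a SimpleUniverse is used; Java 3D then calls pollAndProcessInput() according to the processing mode.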

The Sensor class contains the information about a real-time device such as a mouse, a head tracker, or a joystick.
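For example, application code, typically running inside a Behavior, can ask a registered device's Sensor for its most recent reading. The helper below is purely illustrative (SensorReader and currentPosition are invented names); it assumes a Sensor obtained from the device via getSensor(0).

    import javax.media.j3d.Sensor;
    import javax.media.j3d.Transform3D;
    import javax.vecmath.Vector3d;

    public class SensorReader {
        // Returns the position most recently reported by the given sensor.
        public static Vector3d currentPosition(Sensor sensor) {
            Transform3D latest = new Transform3D();
            sensor.getRead(latest);      // most recent transform read from the device

            Vector3d position = new Vector3d();
            latest.get(position);        // extract the translation component
            return position;
        }
    }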
