Computing Reviews
Multimodal interaction with a wearable augmented reality system
Koelsch M., Bane R., Hoellerer T., Turk M. IEEE Computer Graphics and Applications 26(3): 62-71, 2006. Type: Article
Date Reviewed: Nov 29 2006

In augmented reality (AR) applications, the user experiences reality mixed with some kind of synthetic information and, in some cases, retains freedom of movement. Such applications need natural ways of interacting that common input devices and desktop paradigms cannot offer. Multimodal interfaces typically combine inputs not only from real input devices, but also from speech, hand gestures, and head direction, which can be called virtual input devices.

This paper shows the novice reader the possibilities of novel interfaces and AR applications, while the experienced reader will benefit from the authors' valuable experience with a particular AR application that visualizes urban environments enriched with ubiquitous information.

The authors point out that “novel interaction metaphors must be developed together with user interfaces that are capable of controlling them.” In line with this, the paper presents different tools for interactive visualization, and the multimodal interfaces to control them. Among the designed tools are tunnels, which support adding layers of information in a focused region; paths, which support visualizing how to reach one place from another; and virtual object manipulation, which supports inserting and manipulating simple geometric objects in the scene.

One of the main contributions of the paper is its analysis of the suitability of each virtual or real input device for the kind of command associated with it. The authors conclude that speech is appropriate for nonspatial commands; one-dimensional (1D) inputs are handled well by real mobile devices such as wireless handheld trackballs; vision-based hand gesture recognition and gaze tracking are appropriate for two-dimensional (2D) manipulation; and both two-hand gesture recognition and a combination of one hand with a 1D real mobile device are appropriate for three-dimensional (3D) commands.
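The modality-to-command analysis above can be pictured as a simple dispatch table that routes each command class to the input modalities found suitable for it, falling back through the list as modalities become unavailable. This is a hypothetical sketch, not code from the reviewed paper; all names (the table, the command-class labels, and choose_modality) are illustrative assumptions.

```python
# Illustrative sketch only: map each command class to the modalities the
# reviewed analysis deems suitable, in order of preference. The labels and
# function names are assumptions, not part of the paper's implementation.
PREFERRED_MODALITIES = {
    "nonspatial": ["speech"],
    "1d": ["handheld_trackball"],
    "2d": ["hand_gesture", "gaze_tracking"],
    "3d": ["two_hand_gesture", "one_hand_plus_trackball"],
}

def choose_modality(command_class, available):
    """Return the first preferred modality that is currently available,
    or None if no suitable modality can serve this command class."""
    for modality in PREFERRED_MODALITIES.get(command_class, []):
        if modality in available:
            return modality
    return None
```

For example, a 2D manipulation command with only gaze tracking available would be routed to gaze tracking, while a 3D command with no hand tracking available would yield no match and might prompt the user to switch modalities.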

Reviewer: María Abásolo  Review #: CR133644 (0711-1143)
Categories:
- Interaction Techniques (I.3.6)
- Artificial, Augmented, And Virtual Realities (H.5.1)
- Multimedia Information Systems (H.5.1)
- User Interfaces (H.5.2)
Other reviews under "Interaction Techniques":

Pushdown automata for user interface management
Dan R. J. ACM Transactions on Graphics (TOG) 3(3): 177-203, 1984. Type: Article. Reviewed: Apr 1 1986

A performing medium for working group graphics
Lakin F., Morgan Kaufmann Publishers Inc., San Francisco, CA, 1988. Type: Book (9780934613576). Reviewed: Aug 1 1989

The next generation of interactive technologies
Frenkel K. Communications of the ACM 32(7): 872-881, 1989. Type: Article. Reviewed: Jan 1 1990
