DOI: 10.1145/1450579.1450625

Opportunistic controls: leveraging natural affordances as tangible user interfaces for augmented reality

Published: 27 October 2008

Abstract

We present Opportunistic Controls, a class of user interaction techniques for augmented reality (AR) applications that support gesturing on, and receiving feedback from, otherwise unused affordances already present in the domain environment. Opportunistic Controls leverage characteristics of these affordances to provide passive haptics that ease gesture input, simplify gesture recognition, and provide tangible feedback to the user. 3D widgets are tightly coupled with affordances to provide visual feedback and hints about the functionality of the control. For example, a set of buttons is mapped to existing tactile features on domain objects. We describe examples of Opportunistic Controls that we have designed and implemented using optical marker tracking, combined with appearance-based gesture recognition. We present the results of a user study in which participants performed a simulated maintenance inspection of an aircraft engine using a set of virtual buttons implemented both as Opportunistic Controls and using simpler passive haptics. Opportunistic Controls allowed participants to complete their tasks significantly faster and were preferred over the baseline technique.
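The abstract describes mapping virtual buttons onto existing tactile features of a domain object, with gesture recognition deciding when a fingertip "presses" one. As a minimal sketch of that press logic (not the authors' implementation; all names, coordinates, and thresholds here are hypothetical, and an upstream tracker is assumed to supply the fingertip position in the same coordinate frame as the buttons):

```python
from dataclasses import dataclass
import math

@dataclass
class OpportunisticButton:
    """A virtual button anchored to a tactile feature on a domain object."""
    label: str
    center: tuple          # (x, y, z) of the physical feature, in metres (hypothetical frame)
    radius: float = 0.01   # press tolerance around the feature

    def contains(self, fingertip):
        # Euclidean distance from fingertip to the feature's centre
        dx, dy, dz = (a - b for a, b in zip(fingertip, self.center))
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= self.radius

def pressed_button(buttons, fingertip, dwell_frames, min_dwell=5):
    """Return the button under the fingertip once it has dwelled there for
    min_dwell consecutive frames -- a simple debounce standing in for the
    paper's appearance-based gesture recognition."""
    for b in buttons:
        if b.contains(fingertip):
            dwell_frames[b.label] = dwell_frames.get(b.label, 0) + 1
            if dwell_frames[b.label] >= min_dwell:
                return b
        else:
            dwell_frames[b.label] = 0  # fingertip left the button; reset
    return None
```

Because each button is co-located with a real surface feature, the user feels the press point (passive haptics) while the software only has to test proximity and dwell, which is part of why such controls can simplify recognition.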

Supplementary Material

JPG File (file137-2.jpg)
AVI File (file137-2.avi)




Published In

VRST '08: Proceedings of the 2008 ACM symposium on Virtual reality software and technology
October 2008
288 pages
ISBN: 9781595939517
DOI: 10.1145/1450579
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. 3D interaction
  2. augmented reality
  3. selection metaphor
  4. tangible user interfaces

Qualifiers

  • Research-article

Conference

VRST08

Acceptance Rates

VRST '08 Paper Acceptance Rate: 12 of 68 submissions (18%)
Overall Acceptance Rate: 66 of 254 submissions (26%)

Article Metrics

  • Downloads (last 12 months): 109
  • Downloads (last 6 weeks): 16
Reflects downloads up to 15 Oct 2024

Cited By

  • (2024) Becoming Q: Using Design Workshops to Explore Everyday Objects as Interaction Devices for an Augmented Reality Spy Game. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, 1-11. DOI: 10.1145/3677386.3682096. Online publication date: 7-Oct-2024.
  • (2024) Augmented Object Intelligence with XR-Objects. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-15. DOI: 10.1145/3654777.3676379. Online publication date: 13-Oct-2024.
  • (2024) GraspUI: Seamlessly Integrating Object-Centric Gestures within the Seven Phases of Grasping. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 1275-1289. DOI: 10.1145/3643834.3661551. Online publication date: 1-Jul-2024.
  • (2024) TriPad: Touch Input in AR on Ordinary Surfaces with Hand Tracking Only. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-18. DOI: 10.1145/3613904.3642323. Online publication date: 11-May-2024.
  • (2024) Make Interaction Situated: Designing User Acceptable Interaction for Situated Visualization in Public Environments. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-21. DOI: 10.1145/3613904.3642049. Online publication date: 11-May-2024.
  • (2023) Improving Medical Simulation Using Virtual Reality Augmented by Haptic Proxy. Modern Development and Challenges in Virtual Reality. DOI: 10.5772/intechopen.108330. Online publication date: 18-Oct-2023.
  • (2023) Using Everyday Objects as Props for Virtual Objects in First Person Augmented Reality Games: An Elicitation Study. Proceedings of the ACM on Human-Computer Interaction 7 (CHI PLAY), 856-875. DOI: 10.1145/3611052. Online publication date: 4-Oct-2023.
  • (2023) Ubi-TOUCH: Ubiquitous Tangible Object Utilization through Consistent Hand-object Interaction in Augmented Reality. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-18. DOI: 10.1145/3586183.3606793. Online publication date: 29-Oct-2023.
  • (2023) Chandelier: Interaction Design With Surrounding Mid-Air Tangible Interface. Adjunct Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1-3. DOI: 10.1145/3586182.3616695. Online publication date: 29-Oct-2023.
  • (2023) A Dataset and Machine Learning Approach to Classify and Augment Interface Elements of Household Appliances to Support People with Visual Impairment. Proceedings of the 28th International Conference on Intelligent User Interfaces, 77-90. DOI: 10.1145/3581641.3584038. Online publication date: 27-Mar-2023.
