DOI: 10.1145/2207676.2208585

A handle bar metaphor for virtual object manipulation with mid-air interaction

Published: 05 May 2012

Abstract

Commercial 3D scene acquisition systems such as the Microsoft Kinect sensor can reduce the cost barrier of realizing mid-air interaction. However, since such a sensor can robustly track hand position but not hand orientation, current mid-air interaction methods for 3D virtual object manipulation often require context and mode switching between translation, rotation, and scaling, preventing natural, continuous gestural interaction. A novel handle bar metaphor is proposed as an effective visual control metaphor between the user's hand gestures and the corresponding virtual object manipulation operations. It mimics the familiar situation of handling objects skewered on a bimanual handle bar. Designing the mid-air interaction around the relative 3D motion of the two hands provides precise controllability despite the Kinect sensor's low image resolution. A comprehensive repertoire of 3D manipulation operations is proposed to manipulate single objects, perform fast constrained rotation, and pack/align multiple objects along a line. Three user studies demonstrate the efficacy and intuitiveness of the proposed interaction techniques in different virtual manipulation scenarios.
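The abstract's core idea, mapping the relative 3D motion of the two tracked hands to translation, rotation, and uniform scaling of the skewered object, can be sketched as follows. This is a minimal illustrative sketch under assumed conventions, not the authors' implementation; the function name and the frame-to-frame formulation are assumptions, and a twist about the bar axis cannot be recovered from two point positions alone.

```python
import numpy as np

def handle_bar_transform(l0, r0, l1, r1):
    """Map two hand positions (the 'handle bar' endpoints) at the previous
    frame (l0, r0) and the current frame (l1, r1) to a translation vector,
    a 3x3 rotation matrix, and a uniform scale factor.
    Hypothetical sketch of the bimanual metaphor, not the paper's code."""
    l0, r0, l1, r1 = (np.asarray(p, dtype=float) for p in (l0, r0, l1, r1))
    # Translation: motion of the bar's midpoint.
    translation = (l1 + r1) / 2 - (l0 + r0) / 2
    # Scale: change in hand separation (pulling the hands apart enlarges).
    d0, d1 = np.linalg.norm(r0 - l0), np.linalg.norm(r1 - l1)
    scale = d1 / d0 if d0 > 1e-9 else 1.0
    # Rotation: align the old bar axis with the new one
    # (axis-angle via the cross product, Rodrigues' formula).
    a0 = (r0 - l0) / max(d0, 1e-9)
    a1 = (r1 - l1) / max(d1, 1e-9)
    axis = np.cross(a0, a1)
    s = np.linalg.norm(axis)           # sin(theta)
    c = np.clip(np.dot(a0, a1), -1.0, 1.0)  # cos(theta)
    if s < 1e-9:
        # Parallel axes: no recoverable rotation (the degenerate
        # antiparallel case is ignored in this sketch).
        R = np.eye(3)
    else:
        k = axis / s
        K = np.array([[0, -k[2], k[1]],
                      [k[2], 0, -k[0]],
                      [-k[1], k[0], 0]])
        theta = np.arctan2(s, c)
        R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    return translation, R, scale
```

Applied per frame, this keeps all three operations available simultaneously, which is the continuity the abstract contrasts with mode-switching interfaces.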

Supplementary Material

WMV File (paperfile131-3.wmv)
Supplemental video for “A handle bar metaphor for virtual object manipulation with mid-air interaction”



      Published In

      cover image ACM Conferences
      CHI '12: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
      May 2012
      3276 pages
      ISBN:9781450310154
      DOI:10.1145/2207676

Publisher

Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. 3d manipulation
      2. bimanual gestures
      3. user interaction

      Qualifiers

      • Research-article

Conference

CHI '12

      Acceptance Rates

      Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


      Article Metrics

• Downloads (last 12 months): 146
• Downloads (last 6 weeks): 11
      Reflects downloads up to 12 Sep 2024


      Cited By

• (2024) Entering the Next Dimension: A Review of 3D User Interfaces for Virtual Reality. Electronics, 13(3):600. DOI: 10.3390/electronics13030600. Online publication date: 1-Feb-2024.
• (2024) Enable Natural User Interactions in Handheld Mobile Augmented Reality through Image Computing. Proceedings of the 2024 ACM Symposium on Spatial User Interaction, pages 1-2. DOI: 10.1145/3677386.3688902. Online publication date: 7-Oct-2024.
• (2024) Perception and Action Augmentation for Teleoperation Assistance in Freeform Telemanipulation. ACM Transactions on Human-Robot Interaction, 13(1):1-40. DOI: 10.1145/3643804. Online publication date: 31-Jan-2024.
• (2024) illumotion: An Optical-illusion-based VR Locomotion Technique for Long-Distance 3D Movement. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), pages 924-934. DOI: 10.1109/VR58804.2024.00111. Online publication date: 16-Mar-2024.
• (2024) The Benefits of Near-field Manipulation and Viewing to Distant Object Manipulation in VR. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), pages 408-417. DOI: 10.1109/VR58804.2024.00062. Online publication date: 16-Mar-2024.
• (2024) Object manipulation based on the head manipulation space in VR. International Journal of Human-Computer Studies, article 103346. DOI: 10.1016/j.ijhcs.2024.103346. Online publication date: Aug-2024.
• (2024) Comparison of deviceless methods for distant object manipulation in mixed reality. Computers & Graphics, 122:103959. DOI: 10.1016/j.cag.2024.103959. Online publication date: Aug-2024.
• (2024) Mixed interaction: evaluating user interactions for object manipulations in virtual space. Journal on Multimodal User Interfaces. DOI: 10.1007/s12193-024-00431-2. Online publication date: 22-May-2024.
• (2023) Interpretative Structural Modeling Analyzes the Hierarchical Relationship between Mid-Air Gestures and Interaction Satisfaction. Applied Sciences, 13(5):3129. DOI: 10.3390/app13053129. Online publication date: 28-Feb-2023.
• (2023) Immersive and interactive visualization of 3D spatio-temporal data using a space time hypercube: Application to cell division and morphogenesis analysis. Frontiers in Bioinformatics, 3. DOI: 10.3389/fbinf.2023.998991. Online publication date: 8-Mar-2023.
