
Evaluation of visual feedback techniques for virtual grasping with bare hands using Leap Motion and Oculus Rift

Published: 01 March 2018

Abstract

Bare hand interaction (BHI) allows users to use their hands and fingers to interact with digital content without any attached devices or accessories. For BHI to achieve widespread adoption, interaction techniques for fundamental operations, such as grasp-and-release, need to be identified and optimized. This paper presents a controlled usability evaluation of four common visual feedback techniques in grasp-and-release tasks with bare hands: 'object coloring', 'connecting line', 'shadow' and 'object halo'. Usability was examined in terms of task time, accuracy, errors and user satisfaction. A software test bed was developed for two interface configurations: the Leap Motion controller alone (desktop configuration) and the Leap Motion with the Oculus Rift (virtual reality (VR) configuration). Participants (n = 32) each performed four trials × five feedback techniques × two user interface (UI) configurations, for a total of 1280 trials. The results can be summarized as follows: (a) user performance is significantly better in the VR configuration than in the desktop configuration; (b) the coloring techniques ('object coloring' and 'object halo') are more usable than 'connecting line' regardless of UI; (c) in VR the coloring techniques remain the most usable, while in the desktop interface the 'shadow' technique is also usable and preferred by users; (d) the 'connecting line' technique often distracts users from grasp-and-release tasks on static targets; and (e) users always prefer some visual feedback over none, in both VR and desktop. We discuss these findings in terms of design recommendations for bare hand interactions that involve grasp-and-release tasks.
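To illustrate the kind of feedback the abstract describes, the following is a minimal sketch (not taken from the paper) of how an 'object coloring' technique might map hand-to-object distance to a highlight state. The thresholds, state names and function name are assumptions for demonstration only.

```python
def coloring_feedback(distance_cm: float,
                      grasp_threshold_cm: float = 1.0,
                      hover_threshold_cm: float = 5.0) -> str:
    """Return the highlight state an object should show for a given
    hand-to-object distance (in centimetres). Thresholds are illustrative."""
    if distance_cm <= grasp_threshold_cm:
        return "grasped"      # e.g. tint the object to confirm the grasp
    if distance_cm <= hover_threshold_cm:
        return "graspable"    # e.g. tint the object to signal reachability
    return "neutral"          # no highlight

# Example: as the hand approaches, the object's state changes.
states = [coloring_feedback(d) for d in (10.0, 3.0, 0.5)]
# states == ["neutral", "graspable", "grasped"]
```

An 'object halo' variant would differ only in how the state is rendered (an outline around the object rather than a fill color); the distance-to-state mapping could be shared.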



    Published In

    Virtual Reality, Volume 22, Issue 1 (March 2018), 85 pages
    ISSN: 1359-4338
    EISSN: 1434-9957

    Publisher

    Springer-Verlag, Berlin, Heidelberg


    Author Tags

    1. Bare hand interaction
    2. Leap Motion
    3. Oculus Rift
    4. Usability evaluation
    5. Virtual grasping
    6. Visual feedback techniques


    Cited By

    • (2024) Evaluation of AR Pattern Guidance Methods for a Surface Cleaning Task. In: Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology, pp 1-11. https://doi.org/10.1145/3641825.3687730
    • (2024) Hands or Controllers? How Input Devices and Audio Impact Collaborative Virtual Reality. In: Proceedings of the 30th ACM Symposium on Virtual Reality Software and Technology, pp 1-12. https://doi.org/10.1145/3641825.3687718
    • (2024) Exploring and Modeling Directional Effects on Steering Behavior in Virtual Reality. IEEE Transactions on Visualization and Computer Graphics 30(11):7107-7117. https://doi.org/10.1109/TVCG.2024.3456166
    • (2024) Enhancing hand-object interactions in virtual reality for precision manual tasks. Virtual Reality 28(4). https://doi.org/10.1007/s10055-024-01055-3
    • (2024) Non-photorealistic rendering as a feedback strategy in virtual reality for rehabilitation. Virtual Reality 28(1). https://doi.org/10.1007/s10055-024-00954-9
    • (2022) Push, Tap, Dwell, and Pinch: Evaluation of Four Mid-air Selection Methods Augmented with Ultrasonic Haptic Feedback. Proceedings of the ACM on Human-Computer Interaction 6(ISS):207-225. https://doi.org/10.1145/3567718
    • (2022) "Kapow!": Studying the Design of Visual Feedback for Representing Contacts in Extended Reality. In: Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, pp 1-11. https://doi.org/10.1145/3562939.3565607
    • (2022) The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments. In: Proceedings of Mensch und Computer 2022, pp 85-95. https://doi.org/10.1145/3543758.3543766
    • (2022) A haptic-feedback virtual reality system to improve the Box and Block Test (BBT) for upper extremity motor function assessment. Virtual Reality 27(2):1199-1219. https://doi.org/10.1007/s10055-022-00727-2
    • (2022) Usability, user experience and mental workload in a mobile Augmented Reality application for digital storytelling in cultural heritage. Virtual Reality 27(2):1117-1143. https://doi.org/10.1007/s10055-022-00712-9
