Assessing hands-free interactions for VR using eye gaze and electromyography

Published: 01 June 2019

Abstract

With the increasing popularity of virtual reality (VR) technologies, growing effort is going into developing new input methods. While physical controllers are widely used, more novel techniques, such as eye tracking, are now commercially available. In our work, we investigate the use of physiological signals as input to enhance VR experiences. We present a system that uses gaze tracking together with electromyography (EMG) on a user's forearm to make selection tasks in virtual spaces more efficient. In a study with 16 participants, we compared five input techniques in a Fitts' law task: using gaze tracking for cursor movement combined with forearm contractions for making selections was superior to using an HTC Vive controller, an Xbox gamepad, dwelling time, and eye-gaze dwelling time. To explore application scenarios and collect qualitative feedback, we further developed and evaluated a game based on our input technique. Our findings inform the design of applications that use eye-gaze tracking and forearm muscle movements for effective user input in VR.
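For context, the Fitts' law task mentioned above is a standard pointing evaluation: participants select targets of width W at distance D, performance is summarized by the Shannon index of difficulty ID = log2(D/W + 1) bits, and throughput is typically computed as ID divided by movement time, as in ISO 9241-9 style evaluations.

The interaction itself can be illustrated with a short sketch. The following Python code is not the authors' implementation; the device stubs, the 200 Hz sample rate, and the 0.35 activation threshold are hypothetical stand-ins. It shows the division of labor the abstract describes: gaze moves the cursor, and a forearm contraction, detected as the mean rectified EMG amplitude crossing a calibrated threshold, acts as the click.

# Minimal sketch (NOT the authors' implementation) of the gaze + EMG
# selection loop described in the abstract. Device interfaces are
# hypothetical stubs; a real system would wrap an eye tracker and an
# EMG armband behind the same two calls.

import time
import numpy as np

EMG_THRESHOLD = 0.35   # hypothetical per-user value from a calibration phase
WINDOW_S = 0.05        # 50 ms EMG analysis window
DEBOUNCE_S = 0.3       # ignore re-triggers from the same contraction

def mean_rectified(window: np.ndarray) -> float:
    """Mean absolute value (MAV), a standard EMG amplitude feature."""
    return float(np.mean(np.abs(window)))

class StubTracker:
    def get_gaze_point(self):
        # A real tracker returns the gaze hit point in the VR scene;
        # here we return random normalized coordinates.
        return tuple(np.random.rand(2))

class StubEMG:
    def read_window(self, seconds: float) -> np.ndarray:
        # A real sensor streams forearm samples; we fake 200 Hz noise
        # at resting amplitude, so the threshold is rarely crossed.
        return np.random.randn(int(200 * seconds)) * 0.2

def selection_loop(tracker, emg, on_select, duration_s=2.0):
    """Gaze drives the cursor; an above-threshold contraction selects."""
    last_trigger = 0.0
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        cursor = tracker.get_gaze_point()             # 1) gaze -> cursor
        amplitude = mean_rectified(emg.read_window(WINDOW_S))
        now = time.monotonic()
        if amplitude > EMG_THRESHOLD and now - last_trigger > DEBOUNCE_S:
            on_select(cursor)                         # 2) contraction -> click
            last_trigger = now
        time.sleep(WINDOW_S)

selection_loop(StubTracker(), StubEMG(), lambda c: print("selected at", c))

A real system would replace the stubs with the headset's eye tracker and an EMG armband such as the Myo, and would calibrate EMG_THRESHOLD per user so that resting muscle tone does not trigger selections.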




Published in: Virtual Reality, Volume 23, Issue 2 (June 2019), 92 pages. ISSN 1359-4338, EISSN 1434-9957.

Publisher: Springer-Verlag, Berlin, Heidelberg


Keywords

    1. Electromyography
    2. Eye gaze
    3. Physiological sensing
    4. Virtual reality

