DOI: 10.1145/3603421.3603422
Research article

Integrating Mid-Air Gestures into an Extended Reality Application: A Methodological Approach

Published: 30 October 2023

Abstract

Virtual reality (VR) is increasingly finding its way into people's everyday lives as well as into professional applications, science, and research. This raises the question of whether there are interaction modes better suited to these devices than the standard controllers they typically ship with. Addressing this question requires human-centered studies of the kind typically conducted in human factors research. To facilitate such research, we present a framework for integrating mid-air gesture interaction, in which users perform gestures without touching a button, a surface, or a device, into an HTC Vive VR headset. For gesture recognition we use a Leap Motion device mounted on the front of the HTC Vive. Using Unity, we built a basic application in which users can rotate and move virtual 3D blocks. This paper describes our methodological approach to building this setup.
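The paper itself details the setup; as a rough illustration of how a pinch-based "grab and move" interaction on such a block is commonly wired up in Unity with the Leap Motion C# API, the following minimal sketch (not taken from the paper) attaches a script to a block and moves it while the right hand pinches close to it. The class name PinchGrabBlock, the LeapToWorld helper, and the threshold values are our own illustrative choices, and the sketch assumes the classic LeapCSharp API in which PalmPosition is a Leap.Vector reported in millimetres in device coordinates; newer Ultraleap Unity plugins expose this differently.

// Hypothetical sketch (not from the paper): a Unity MonoBehaviour that lets the
// right hand pick up and move a block with a pinch gesture, reading hand data
// through the LeapCSharp Controller class.
using UnityEngine;
using Leap;

public class PinchGrabBlock : MonoBehaviour
{
    const float PinchOnThreshold  = 0.8f;  // PinchStrength above this counts as "grabbed"
    const float PinchOffThreshold = 0.5f;  // release when the pinch opens again
    const float GrabRadius        = 0.15f; // metres: hand must be this close to the block

    Controller leap;   // connection to the Leap Motion tracking service
    bool grabbed;

    void Start()
    {
        leap = new Controller();
    }

    void Update()
    {
        Frame frame = leap.Frame();              // latest tracking frame
        foreach (Hand hand in frame.Hands)
        {
            if (!hand.IsRight) continue;         // demo: right hand only

            Vector3 palm = LeapToWorld(hand.PalmPosition);
            float pinch = hand.PinchStrength;    // 0 (open) .. 1 (fully pinched)

            if (!grabbed && pinch > PinchOnThreshold &&
                Vector3.Distance(palm, transform.position) < GrabRadius)
            {
                grabbed = true;
            }
            else if (grabbed && pinch < PinchOffThreshold)
            {
                grabbed = false;
            }

            if (grabbed)
            {
                transform.position = palm;       // block follows the pinching hand
            }
        }
    }

    // Placeholder conversion from Leap's millimetre, device-centred coordinates
    // to Unity world space (an assumption for this sketch). A head-mounted rig,
    // as in the paper's setup, would transform through the tracked HMD pose.
    Vector3 LeapToWorld(Vector v)
    {
        return Camera.main.transform.TransformPoint(
            new Vector3(v.x, v.y, -v.z) * 0.001f);
    }
}

Rotation could be handled analogously by following the hand's orientation while grabbed; the sketch keeps only the positional case to stay minimal.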

          Published In

          ICVARS '23: Proceedings of the 2023 7th International Conference on Virtual and Augmented Reality Simulations
          March 2023
          133 pages
          ISBN: 9781450397469
          DOI: 10.1145/3603421

          Publisher

          Association for Computing Machinery

          New York, NY, United States
