DOI: 10.1145/3447527.3474875
Extended abstract

Low-level Voice and Hand-Tracking Interaction Actions: Explorations with Let's Go There

Published: 27 September 2021

Abstract

Hand-tracking allows users to engage with a virtual environment using their own hands, rather than the more traditional method of operating the device and interacting with the virtual world through accompanying controllers. We seek to explore the range of low-level interaction actions and high-level interaction tasks and domains that can be associated with multimodal hand-tracking and voice input in VR. To this end, we created Let's Go There, a system that explores this joint-input method. So far, we have identified four low-level interaction actions, which are exemplified by this demo: positioning oneself, positioning others, selection, and information assignment. We anticipate that potential high-level interaction tasks and domains will include customer service training, social skills training, and cultural competency training (e.g., when interacting with older adults). Let's Go There, the system described in this paper, was previously demonstrated at CUI 2020 and MobileHCI 2021. We have since updated our development approach, separating it into low- and high-level interactions. Thus, we believe there is value in bringing it to MobileHCI again to highlight these different types of interactions for further showcase and discussion.
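The four low-level actions each pair a recognized voice intent with a hand-tracking pointing event. As an illustration only (the paper does not publish its implementation; all names below are hypothetical), a minimal fusion step in the spirit of "Put-that-there" might dispatch (voice, pointing) pairs like this:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Hypothetical sketch; not the authors' published code.

@dataclass
class PointingEvent:
    target_id: Optional[str]            # object hit by the hand ray, if any
    position: Tuple[float, float, float]  # world-space point being indicated

def fuse(voice_intent: str, pointing: PointingEvent) -> tuple:
    """Map a (voice intent, pointing event) pair to a low-level action."""
    if voice_intent == "go_there":
        # Positioning oneself: teleport the user to the indicated point.
        return ("position_self", pointing.position)
    if voice_intent == "send_there" and pointing.target_id:
        # Positioning others: move the pointed-at agent to a location.
        return ("position_other", pointing.target_id, pointing.position)
    if voice_intent == "select" and pointing.target_id:
        # Selection: pick the pointed-at object or agent.
        return ("selection", pointing.target_id)
    if voice_intent.startswith("label:") and pointing.target_id:
        # Information assignment: attach spoken data to the target.
        return ("information_assignment", pointing.target_id, voice_intent[6:])
    return ("no_op",)
```

The voice channel supplies the verb and any payload, while the pointing channel resolves the deictic "there"; neither modality alone is sufficient for the last three actions.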



Published In

MobileHCI '21: Adjunct Publication of the 23rd International Conference on Mobile Human-Computer Interaction
September 2021, 150 pages
ISBN: 9781450383295
DOI: 10.1145/3447527
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Pointing
  2. Technology Adoption
  3. Virtual Agents
  4. Virtual Reality
  5. Voice Technology
  6. Voice User Interface

Qualifiers

  • Extended-abstract
  • Research
  • Refereed limited

Conference

MobileHCI '21: 23rd International Conference on Mobile Human-Computer Interaction
September 27 - October 1, 2021
Toulouse & Virtual, France

Acceptance Rates

Overall acceptance rate: 202 of 906 submissions (22%)

