Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)

DOI: 10.1145/3610978.3638158
Published: 11 March 2024

Abstract

The 7th International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI) seeks to bring together researchers from human-robot interaction (HRI), robotics, and mixed reality (MR) to address the challenges of mixed reality interactions between humans and robots. Key topics include developing robots capable of interacting with humans in mixed reality, using virtual reality to create interactive robots, designing augmented reality interfaces for human-robot communication, exploring mixed reality interfaces that enhance robot learning, comparing the capabilities and perceptions of robots and virtual agents, and sharing best design practices. VAM-HRI 2024 will build on the success of the VAM-HRI workshops held from 2018 to 2023, advancing research in this specialized community. The prior year's website is located at https://vam-hri.github.io.

    Published In

    HRI '24: Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction
    March 2024
    1408 pages
    ISBN: 9798400703232
    DOI: 10.1145/3610978
    Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. AR
    2. MR
    3. VR
    4. human-robot interaction
    5. robotics

    Qualifiers

    • Abstract

    Conference

    HRI '24

    Acceptance Rates

    Overall Acceptance Rate 268 of 1,124 submissions, 24%
