DOI: 10.1145/3476124.3488618
poster

eyemR-Talk: Using Speech to Visualise Shared MR Gaze Cues

Published: 14 December 2021

Abstract

In this poster we present eyemR-Talk, a Mixed Reality (MR) collaboration system that uses speech input to trigger shared gaze visualisations between remote users. The system uses 360° panoramic video to support collaboration between a local user in the real world in an Augmented Reality (AR) view and a remote collaborator in Virtual Reality (VR). By recognising specific speech phrases that turn on virtual gaze visualisations, the system enables contextual speech-gaze interaction between collaborators. The overall benefit is more natural gaze awareness, leading to better communication and more effective collaboration.
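The abstract describes gaze visualisations that are switched on when specific speech phrases are detected. The paper does not publish its implementation, but the core trigger logic can be sketched as a simple phrase-to-cue mapping applied to a speech transcript. The phrase strings and cue names below are hypothetical, chosen only for illustration:

```python
from typing import Optional

# Hypothetical mapping from trigger phrases to gaze visualisation cues.
# None means the phrase turns the shared visualisation off.
TRIGGER_PHRASES = {
    "look here": "gaze_ray",
    "do you see": "gaze_cursor",
    "stop sharing": None,
}


def update_gaze_cue(transcript: str, current_cue: Optional[str]) -> Optional[str]:
    """Return the gaze visualisation to show after hearing `transcript`.

    If the transcript contains a known trigger phrase, switch to that
    phrase's cue (or turn the cue off); otherwise keep the current state.
    """
    lowered = transcript.lower()
    for phrase, cue in TRIGGER_PHRASES.items():
        if phrase in lowered:
            return cue
    return current_cue
```

In a real system the transcript would come from a continuous speech recogniser on either the AR or VR side, and the returned cue would control which gaze visualisation is rendered for both collaborators.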

Supplementary Material

MP4 File (3476124.3488618.mp4): presentation video


Cited By

  • Near-Gaze Visualisations of Empathic Communication Cues in Mixed Reality Collaboration. ACM SIGGRAPH 2022 Posters, 1–2. https://doi.org/10.1145/3532719.3543213. Online publication date: 27 July 2022.
  • Using Speech to Visualise Shared Gaze Cues in MR Remote Collaboration. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 250–259. https://doi.org/10.1109/VR51125.2022.00044. Online publication date: March 2022.
  • Comparing Gaze-Supported Modalities with Empathic Mixed Reality Interfaces in Remote Collaboration. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 837–846. https://doi.org/10.1109/ISMAR55827.2022.00102. Online publication date: October 2022.


Published In

SA '21 Posters: SIGGRAPH Asia 2021 Posters
December 2021
87 pages
ISBN:9781450386876
DOI:10.1145/3476124
Editors: Shuzo John Shiota, Ayumi Kimura, Wan-Chun Alex Ma
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Mixed Reality remote collaboration
  2. gaze visualization
  3. speech input

Qualifiers

  • Poster
  • Research
  • Refereed limited

Conference

SA '21
Sponsor:
SA '21: SIGGRAPH Asia 2021
December 14 - 17, 2021
Tokyo, Japan

Acceptance Rates

Overall Acceptance Rate 178 of 869 submissions, 20%


Article Metrics

  • Downloads (Last 12 months)30
  • Downloads (Last 6 weeks)1
Reflects downloads up to 08 Feb 2025

