DOI: 10.1145/3411763.3451545
Demonstration

eyemR-Vis: A Mixed Reality System to Visualise Bi-Directional Gaze Behavioural Cues Between Remote Collaborators

Published: 08 May 2021

Abstract

This demonstration presents eyemR-Vis, a 360 panoramic Mixed Reality collaboration system that shares dynamic gaze behavioural cues as bi-directional spatial visualisations between a local host (AR) and a remote collaborator (VR). This enables richer communication of gaze through four visualisation techniques: browse, focus, mutual-gaze, and fixated circle-map. The system also supports simple bi-directional avatar interaction and panoramic video zoom, making interaction in the normally constrained remote task space more flexible and natural. By reallocating and visualising communication cues that are otherwise physically inaccessible in the remote task space, eyemR-Vis aims to provide a more engaging and effective remote collaboration experience.
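The demonstration itself ships no source code, so the following is only a rough sketch of the kind of logic such a system needs: it classifies a short window of one collaborator's gaze samples into the "browse" and "focus" behaviours named above, and flags mutual gaze when the AR host's and VR collaborator's gaze points land close together on the shared panorama. All names, thresholds, and the dispersion/dwell heuristic here are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch (not from the paper): deriving the behavioural
    # cues named in the abstract from raw gaze samples.
    from dataclasses import dataclass
    import math

    @dataclass
    class GazeSample:
        x: float   # normalised gaze point on the 360 panorama
        y: float
        t: float   # timestamp in seconds

    # Assumed thresholds; the paper does not specify any of these values.
    FOCUS_MIN_DWELL = 0.6        # seconds of dwell before "browse" becomes "focus"
    FOCUS_MAX_DISPERSION = 0.02  # max spread of a fixation cluster
    MUTUAL_GAZE_RADIUS = 0.05    # how close two gaze points must be

    def classify(window: list[GazeSample]) -> str:
        """Label one collaborator's recent gaze as 'browse' or 'focus'."""
        if len(window) < 2:
            return "browse"
        cx = sum(s.x for s in window) / len(window)
        cy = sum(s.y for s in window) / len(window)
        dispersion = max(math.hypot(s.x - cx, s.y - cy) for s in window)
        dwell = window[-1].t - window[0].t
        if dispersion <= FOCUS_MAX_DISPERSION and dwell >= FOCUS_MIN_DWELL:
            return "focus"
        return "browse"

    def mutual_gaze(host: GazeSample, remote: GazeSample) -> bool:
        """True when host (AR) and collaborator (VR) look at the same region."""
        return math.hypot(host.x - remote.x, host.y - remote.y) <= MUTUAL_GAZE_RADIUS

    if __name__ == "__main__":
        window = [GazeSample(0.500, 0.500, 0.0), GazeSample(0.505, 0.498, 0.7)]
        print(classify(window))                                       # "focus" under these thresholds
        print(mutual_gaze(window[-1], GazeSample(0.51, 0.50, 0.7)))   # True

In a real bi-directional system each side would stream such labels to the other and render them as the spatial visualisations described above (e.g. a fixated circle-map at the focus point); that rendering layer is beyond the scope of this sketch.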

Supplemental Material

MP4 File: Supplemental video
Transcript for: Supplemental video



Published In

CHI EA '21: Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems
May 2021, 2965 pages
ISBN: 9781450380959
DOI: 10.1145/3411763

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. CSCW
  2. Gaze Visualisation
  3. Human-Computer Interaction
  4. Mixed Reality Remote Collaboration

Qualifiers

  • Demonstration
  • Research
  • Refereed limited

Conference

CHI '21

Acceptance Rates

Overall Acceptance Rate 6,164 of 23,696 submissions, 26%


Cited By

  • (2024) Field Trial of a Tablet-based AR System for Intergenerational Connections through Remote Reading. Proceedings of the ACM on Human-Computer Interaction 8, CSCW1, 1–28. https://doi.org/10.1145/3653696 (26-Apr-2024)
  • (2022) The Impact of Sharing Gaze Behaviours in Collaborative Mixed Reality. Proceedings of the ACM on Human-Computer Interaction 6, CSCW2, 1–27. https://doi.org/10.1145/3555564 (11-Nov-2022)
  • (2022) Using Speech to Visualise Shared Gaze Cues in MR Remote Collaboration. 2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), 250–259. https://doi.org/10.1109/VR51125.2022.00044 (Mar-2022)
  • (2022) Comparing Gaze-Supported Modalities with Empathic Mixed Reality Interfaces in Remote Collaboration. 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 837–846. https://doi.org/10.1109/ISMAR55827.2022.00102 (Oct-2022)
  • (2021) eyemR-Talk: Using Speech to Visualise Shared MR Gaze Cues. SIGGRAPH Asia 2021 Posters, 1–2. https://doi.org/10.1145/3476124.3488618 (14-Dec-2021)
