DOI: 10.1145/3485279.3485309

Altering Non-verbal Cues to Implicitly Direct Attention in Social VR

Published: 09 November 2021

Abstract

In this work we explore a concept system that alters a user’s virtual eye movements without their awareness, and examine whether this can affect social attention among others. Our concept augments the real eye movements with subtle gaze redirections toward other people, applied at intervals so that they remain unnoticed. We present a user study in which groups of people converse on a topic, and measure the level of visual attention among users. Compared to a baseline of natural eye movements, we find that the method did affect the overall attention in the group, but in unexpected ways. Our work points to a new way to exploit the inherent role of the eyes in social virtual reality.


Published In

SUI '21: Proceedings of the 2021 ACM Symposium on Spatial User Interaction
November 2021
206 pages
ISBN:9781450390910
DOI:10.1145/3485279
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. Collaboration
  2. Eye-tracking
  3. User Attention
  4. Virtual Reality

Qualifiers

  • Abstract
  • Research
  • Refereed limited

Conference

SUI '21
SUI '21: Symposium on Spatial User Interaction
November 9 - 10, 2021
Virtual Event, USA

Acceptance Rates

Overall Acceptance Rate 86 of 279 submissions, 31%

Article Metrics

  • Downloads (last 12 months): 26
  • Downloads (last 6 weeks): 2
Reflects downloads up to 25 Jan 2025

Cited By

  • (2024) Sensing the Intentions to Speak in VR Group Discussions. Sensors 24(2), 362. DOI: 10.3390/s24020362. Online publication date: 7-Jan-2024
  • (2024) Investigating the Gap: Gaze and Movement Analysis in Immersive Environments. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3653522. Online publication date: 4-Jun-2024
  • (2024) Cues to fast-forward collaboration: A Survey of Workspace Awareness and Visual Cues in XR Collaborative Systems. Computer Graphics Forum 43(2). DOI: 10.1111/cgf.15066. Online publication date: 30-Apr-2024
  • (2023) Guiding Visual Attention on 2D Screens: Effects of Gaze Cues from Avatars and Humans. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-9. DOI: 10.1145/3607822.3614529. Online publication date: 13-Oct-2023
  • (2023) Exploring the impact of non-verbal cues on user experience in immersive virtual reality. Computer Animation and Virtual Worlds 35(1). DOI: 10.1002/cav.2224. Online publication date: 19-Dec-2023
  • (2022) Nonverbal Communication in Immersive Virtual Reality through the Lens of Presence: A Critical Review. PRESENCE: Virtual and Augmented Reality 31, 147-187. DOI: 10.1162/pres_a_00387. Online publication date: 1-Dec-2022
