
An assessment of eye-gaze potential within immersive virtual environments

Published: 12 December 2007

Abstract

In collaborative situations, eye gaze is a critical element of behavior which supports and fulfills many activities and roles. In current computer-supported collaboration systems, eye gaze is poorly supported. Even in a state-of-the-art video conferencing system such as the Access Grid, although one can see the face of the user, much of the communicative power of eye gaze is lost. This article gives an overview of some preliminary work that looks towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact that this would have on interaction between the users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each experiment, subjects observed on a large screen the eye-gaze behavior of an avatar. The eye-gaze behavior of that avatar had previously been recorded from a user with a head-mounted eye tracker. The first experiment assessed the difference between users' ability to judge which objects an avatar is looking at when only head gaze is displayed and when both eye- and head-gaze data are displayed. The results show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or binocular eye tracker would be required. This was tested by measuring subjects' ability to identify where an avatar was looking from eye direction alone, or from eye direction combined with convergence. This experiment showed that convergence had a significant impact on the subjects' ability to identify where the avatar was looking. The final experiment examined the effects of stereo versus mono viewing of the scene, with subjects again asked to identify where the avatar was looking. This experiment showed no difference between the two viewing conditions in the subjects' ability to detect where the avatar was gazing. This is followed by a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment, and some preliminary results from the use of such a system.
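The convergence measurement discussed in the abstract relies on a simple geometric idea: with a binocular tracker, each eye contributes a gaze ray, and the 3D fixation point can be estimated as the point where the two rays (nearly) meet. The sketch below is a hypothetical illustration of that idea, not the authors' implementation; the function name, coordinate conventions, and tolerance are assumptions. It computes the midpoint of the shortest segment between two possibly skew gaze rays.

```python
import numpy as np

def convergence_point(p_l, d_l, p_r, d_r, eps=1e-9):
    """Estimate the 3D fixation point from two gaze rays.

    p_l, p_r: 3D origins of the left and right eye rays.
    d_l, d_r: gaze direction vectors (need not be unit length).
    Returns the midpoint of the shortest segment between the two rays,
    or None if the rays are (near-)parallel, i.e. the eyes are
    fixating at or near infinity and no convergence depth exists.
    """
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w0 = p_l - p_r
    # Standard closest-point-between-two-lines coefficients.
    b = d_l @ d_r          # cosine of the angle between the rays
    d = d_l @ w0
    e = d_r @ w0
    denom = 1.0 - b * b    # zero when the rays are parallel
    if abs(denom) < eps:
        return None
    s = (b * e - d) / denom
    t = (e - b * d) / denom
    closest_l = p_l + s * d_l
    closest_r = p_r + t * d_r
    return (closest_l + closest_r) / 2.0

# Eyes 6 cm apart, both converging on a point 1 m straight ahead.
left = np.array([-0.03, 0.0, 0.0])
right = np.array([0.03, 0.0, 0.0])
target = np.array([0.0, 0.0, 1.0])
fix = convergence_point(left, target - left, right, target - right)
# fix recovers the target, (0, 0, 1)
```

In practice, tracker noise means the two rays almost never intersect exactly, which is why the midpoint of the shortest connecting segment is used rather than a true intersection; a monocular tracker provides only one ray, so no such depth estimate is possible.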



    Published In

ACM Transactions on Multimedia Computing, Communications, and Applications, Volume 3, Issue 4
    December 2007
    147 pages
    ISSN:1551-6857
    EISSN:1551-6865
    DOI:10.1145/1314303

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 12 December 2007
    Accepted: 01 August 2007
    Received: 01 August 2007
    Published in TOMM Volume 3, Issue 4


    Author Tags

    1. Immersive virtual environments
    2. eye gaze

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Article Metrics

    • Downloads (Last 12 months)18
    • Downloads (Last 6 weeks)3
    Reflects downloads up to 18 Jan 2025

    Cited By

    • (2024) Indicators Specification for Maturity Evaluation of BIM-Based VR/AR Systems Using ISO/IEC 15939 Standard. Extended Reality, 249-258. DOI: 10.1007/978-3-031-71707-9_19. Online publication date: 11-Sep-2024.
    • (2020) Sharing gaze rays for visual target identification tasks in collaborative augmented reality. Journal on Multimodal User Interfaces 14, 4, 353-371. DOI: 10.1007/s12193-020-00330-2. Online publication date: 9-Jul-2020.
    • (2019) Eye gaze and head gaze in collaborative games. Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, 1-9. DOI: 10.1145/3317959.3321489. Online publication date: 25-Jun-2019.
    • (2018) Effects of Hybrid and Synthetic Social Gaze in Avatar-Mediated Interactions. 2018 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct), 103-108. DOI: 10.1109/ISMAR-Adjunct.2018.00044. Online publication date: Oct-2018.
    • (2017) Here's Looking At You Anyway! Proceedings of the Annual Symposium on Computer-Human Interaction in Play, 531-540. DOI: 10.1145/3116595.3116619. Online publication date: 15-Oct-2017.
    • (2015) A Review of Eye Gaze in Virtual Agents, Social Robotics and HCI. Computer Graphics Forum 34, 6, 299-326. DOI: 10.1111/cgf.12603. Online publication date: 1-Sep-2015.
    • (2015) Eye Gaze for Consumer Electronics: Controlling and commanding intelligent systems. IEEE Consumer Electronics Magazine 4, 4, 65-71. DOI: 10.1109/MCE.2015.2464852. Online publication date: Oct-2015.
    • (2014) Using relative head and hand-target features to predict intention in 3D moving-target selection. 2014 IEEE Virtual Reality (VR), 51-56. DOI: 10.1109/VR.2014.6802050. Online publication date: Mar-2014.
    • (2014) Measuring Eye Gaze Convergent Distance within Immersive Virtual Environments. Procedia Engineering 69, 333-339. DOI: 10.1016/j.proeng.2014.02.240. Online publication date: 2014.
    • (2014) Eye-Gaze Tracking-Based Telepresence System for Videoconferencing. Active Media Technology, 432-441. DOI: 10.1007/978-3-319-09912-5_36. Online publication date: 2014.
