Subtle gaze direction

Published: 08 September 2009
    Abstract

    This article presents a novel technique that combines eye-tracking with subtle image-space modulation to direct a viewer's gaze about a digital image. We call this paradigm subtle gaze direction. Subtle gaze direction exploits the fact that our peripheral vision has very poor acuity compared to our foveal vision. By presenting brief, subtle modulations to the peripheral regions of the field of view, the technique presented here draws the viewer's foveal vision to the modulated region. Additionally, by monitoring saccadic velocity and exploiting the visual phenomenon of saccadic masking, modulation is automatically terminated before the viewer's foveal vision enters the modulated region. Hence, the viewer is never actually allowed to scrutinize the stimuli that attracted her gaze. This new subtle gaze directing technique has potential application in many areas including large scale display systems, perceptually adaptive rendering, and complex visual search tasks.
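
    Read as an algorithm, the abstract describes a simple gating loop: present a brief, subtle modulation while the target region still lies in the viewer's low-acuity periphery, and switch it off as soon as saccadic velocity indicates the eye is in flight toward it, relying on saccadic masking to hide the change. The Python sketch below is not taken from the article; the velocity and eccentricity thresholds, the assumption that gaze and target positions arrive as (x, y) coordinates in degrees of visual angle, and the function names are illustrative placeholders for the eye-tracker and display plumbing a real implementation would need.

        import math

        # Illustrative thresholds only; the article does not report exact values here.
        SACCADE_VELOCITY_DEG_PER_S = 130.0  # assumed velocity cutoff for saccade detection
        PERIPHERY_RADIUS_DEG = 5.0          # assumed eccentricity below which the target is "foveal"

        def gaze_velocity(prev_gaze, curr_gaze, dt):
            """Angular speed (deg/s) between two gaze samples given as (x, y) in degrees."""
            dx = curr_gaze[0] - prev_gaze[0]
            dy = curr_gaze[1] - prev_gaze[1]
            return math.hypot(dx, dy) / dt

        def modulation_enabled(prev_gaze, curr_gaze, target, dt):
            """Decide whether the subtle modulation at `target` should be shown this frame.

            Returns False during saccades (so saccadic masking hides the termination)
            and whenever the target sits near the current fixation, so foveal vision
            never gets to scrutinize the modulated region.
            """
            if gaze_velocity(prev_gaze, curr_gaze, dt) > SACCADE_VELOCITY_DEG_PER_S:
                return False  # saccade in flight: terminate the modulation immediately
            dist = math.hypot(curr_gaze[0] - target[0], curr_gaze[1] - target[1])
            return dist > PERIPHERY_RADIUS_DEG  # modulate only while the target is peripheral

        # A slow fixation far from the target keeps the modulation on ...
        print(modulation_enabled((0.0, 0.0), (0.1, 0.0), (12.0, 3.0), dt=1 / 120))  # True
        # ... but a fast saccade toward it switches the modulation off.
        print(modulation_enabled((0.0, 0.0), (4.0, 0.0), (12.0, 3.0), dt=1 / 120))  # False

    In a full system this check would presumably be re-evaluated at the eye tracker's sampling rate and gate a per-region luminance or warm-cool modulation of the displayed image, the modulation types named in the author tags below.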

    Published In

    ACM Transactions on Graphics, Volume 28, Issue 4
    August 2009, 116 pages
    ISSN: 0730-0301
    EISSN: 1557-7368
    DOI: 10.1145/1559755
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 08 September 2009
    Accepted: 01 June 2009
    Revised: 01 June 2008
    Received: 01 July 2007
    Published in TOG Volume 28, Issue 4

    Author Tags

    1. Luminance
    2. eye-tracking
    3. image-based
    4. modulation
    5. visual acuity
    6. warm-cool

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Article Metrics

    • Downloads (last 12 months): 217
    • Downloads (last 6 weeks): 16

    Cited By

    • (2024) Synergy and medial effects of multimodal cueing with auditory and electrostatic force stimuli on visual field guidance in 360° VR. Frontiers in Virtual Reality, 5. DOI: 10.3389/frvir.2024.1379351. Online publication date: 4-Jun-2024.
    • (2024) Effective Assistance for Iterative Visual Search Tasks by Using Gaze-Based Visual Cognition Estimation. Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, 36(2), 623-630. DOI: 10.3156/jsoft.36.2_623. Online publication date: 15-May-2024.
    • (2024) GazeAway: Designing for Gaze Aversion Experiences. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 1-6. DOI: 10.1145/3613905.3650771. Online publication date: 11-May-2024.
    • (2024) Spatial Gaze Markers: Supporting Effective Task Switching in Augmented Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3613904.3642811. Online publication date: 11-May-2024.
    • (2024) MOSion: Gaze Guidance with Motion-triggered Visual Cues by Mosaic Patterns. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3613904.3642577. Online publication date: 11-May-2024.
    • (2024) Flicker Augmentations: Rapid Brightness Modulation for Real-World Visual Guidance using Augmented Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642085. Online publication date: 11-May-2024.
    • (2024) Cross-Reality Attention Guidance on the Light Field Display. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 809-810. DOI: 10.1109/VRW62533.2024.00201. Online publication date: 16-Mar-2024.
    • (2024) Deceptive Patterns and Perceptual Risks in an Eye-Tracked Virtual Reality. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 341-344. DOI: 10.1109/VRW62533.2024.00068. Online publication date: 16-Mar-2024.
    • (2024) Attention Guidance In The Wild: An Experiment Testing Visual Guidance Cues for VR Field Trips at High Schools. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 173-179. DOI: 10.1109/VRW62533.2024.00035. Online publication date: 16-Mar-2024.
    • (2024) The Differential Effects of Multisensory Attentional Cues on Task Performance in VR Depending on the Level of Cognitive Load and Cognitive Capacity. IEEE Transactions on Visualization and Computer Graphics, 30(5), 2703-2712. DOI: 10.1109/TVCG.2024.3372126. Online publication date: May-2024.
