DOI: 10.1145/2909132.2909254

Visual Guidance with Unnoticed Blur Effect

Published: 07 June 2016

Abstract

In information media such as TV programs, digital signage, and web pages, content providers often want to guide viewers' attention to a particular location on the display. However, "active" methods such as flashing the display, using animation, or changing colors often interrupt viewers' concentration and make them feel annoyed. This paper proposes a method for guiding viewers' attention without their noticing. Exploiting a characteristic of the human visual system, we propose a dynamic blur control method: the image on the display is gradually blurred, up to the threshold at which viewers become aware of the modulation, while the region to which viewers' attention should be guided remains unblurred. Two subjective experiments were conducted to show the effectiveness of the method. In the first, viewers' attention was guided to the unblurred region using blur control. In the second, the threshold at which viewers became aware of the modulation was measured, and viewers' gaze was guided while staying below this threshold. This shows that viewers' attention can be guided without them noticing.
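The abstract only outlines the blur-control idea, so the following is a minimal sketch of how such a dynamic blur could be realized; it is not the authors' implementation. It assumes OpenCV and NumPy, a color (H, W, 3) frame, and illustrative parameters (a soft circular mask, a linear ramp, and a maximum Gaussian sigma that would be kept below the awareness threshold the paper measures).

```python
# Minimal sketch (not the authors' code) of the dynamic blur control idea:
# the frame is blurred progressively toward a maximum strength that would be
# kept below the viewer's awareness threshold, while a soft circular mask
# keeps the attention-target region sharp. Assumes OpenCV + NumPy and a
# color (H, W, 3) frame; all parameter values are illustrative.
import cv2
import numpy as np

def guided_blur(frame, center, radius=120, t=0.0, max_sigma=3.0):
    """Blend a blurred copy of `frame` with the original, keeping a circular
    region around `center` (x, y) unblurred. `t` in [0, 1] is the ramp
    position; the Gaussian sigma grows linearly toward `max_sigma`."""
    sigma = max(1e-3, t * max_sigma)
    blurred = cv2.GaussianBlur(frame, (0, 0), sigmaX=sigma)

    # Soft circular mask: 1.0 inside the protected region, fading to 0.0 outside.
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xx - center[0]) ** 2 + (yy - center[1]) ** 2)
    mask = np.clip(1.0 - (dist - radius) / radius, 0.0, 1.0)[..., None]

    return (mask * frame + (1.0 - mask) * blurred).astype(frame.dtype)

# Example: ramp the blur in over 5 seconds of 30 fps video,
# guiding attention toward the region around pixel (400, 300).
# for i, frame in enumerate(frames):
#     out = guided_blur(frame, center=(400, 300), t=min(1.0, i / (5 * 30)))
```

In practice the ramp rate and maximum sigma would be chosen from the awareness threshold estimated in the paper's second experiment, so that the modulation stays unnoticed.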



    Published In

    AVI '16: Proceedings of the International Working Conference on Advanced Visual Interfaces
    June 2016
    400 pages
    ISBN:9781450341318
    DOI:10.1145/2909132


    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 07 June 2016


    Author Tags

    1. Attentive user interfaces
    2. dynamic blur control
    3. gaze direction
    4. visual saliency

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    AVI '16

    Acceptance Rates

    AVI '16 Paper Acceptance Rate: 20 of 96 submissions (21%);
    Overall Acceptance Rate: 128 of 490 submissions (26%)


    Article Metrics

    • Downloads (Last 12 months): 90
    • Downloads (Last 6 weeks): 1
    Reflects downloads up to 18 Aug 2024

    Cited By
    • (2024) Effective Assistance for Iterative Visual Search Tasks by Using Gaze-Based Visual Cognition Estimation. Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, 36(2), 623-630. DOI: 10.3156/jsoft.36.2_623. Online publication date: 15-May-2024.
    • (2024) MOSion: Gaze Guidance with Motion-triggered Visual Cues by Mosaic Patterns. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3613904.3642577. Online publication date: 11-May-2024.
    • (2024) Flicker Augmentations: Rapid Brightness Modulation for Real-World Visual Guidance using Augmented Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642085. Online publication date: 11-May-2024.
    • (2024) "May I Speak?": Multi-Modal Attention Guidance in Social VR Group Conversations. IEEE Transactions on Visualization and Computer Graphics, 30(5), 2287-2297. DOI: 10.1109/TVCG.2024.3372119. Online publication date: 7-Mar-2024.
    • (2022) Application of Spatial Cues and Optical Distortions as Augmentations during Virtual Reality (VR) Gaming: The Multifaceted Effects of Assistance for Eccentric Viewing Training. International Journal of Environmental Research and Public Health, 19(15), 9571. DOI: 10.3390/ijerph19159571. Online publication date: 4-Aug-2022.
    • (2022) Look over there! Investigating Saliency Modulation for Visual Guidance with Augmented Reality Glasses. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1-15. DOI: 10.1145/3526113.3545633. Online publication date: 29-Oct-2022.
    • (2022) Visual Delegate Generalization Frame – Evaluating Impact of Visual Effects and Elements on Player and User Experiences in Video Games and Interactive Virtual Environments. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-20. DOI: 10.1145/3491102.3501885. Online publication date: 29-Apr-2022.
    • (2022) Programmable Peripheral Vision: augment/reshape human visual perception. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1-5. DOI: 10.1145/3491101.3503821. Online publication date: 27-Apr-2022.
    • (2022) GazeSync: Eye Movement Transfer Using an Optical Eye Tracker and Monochrome Liquid Crystal Displays. Companion Proceedings of the 27th International Conference on Intelligent User Interfaces, 54-57. DOI: 10.1145/3490100.3516469. Online publication date: 22-Mar-2022.
    • (2022) Designing User-Guidance for eXtendend Reality Interfaces in Industrial Environments. Human-Technology Interaction, 173-198. DOI: 10.1007/978-3-030-99235-4_7. Online publication date: 14-Dec-2022.
