DOI: 10.1145/2492494.2492508

Guiding attention in controlled real-world environments

Published: 22 August 2013

Abstract

The ability to direct a viewer's attention has important applications in computer graphics, data visualization, image analysis, and training. Existing computer-based gaze manipulation techniques, which direct a viewer's attention about a display, have been shown to be effective for spatial learning, search task completion, and medical training applications. In this work we extend the concept of gaze manipulation beyond digital imagery to include controlled, real-world environments. We address two main challenges in guiding attention to real-world objects: determining what object the viewer is currently paying attention to, and providing (projecting) a visual cue on a different part of the scene in order to draw the viewer's attention there. Our system consists of a pair of eye-tracking glasses to determine the viewer's gaze location, and a projector to create the visual cue in the physical environment. The results of a user study show that we can effectively direct the viewer's gaze in the real-world scene. Our technique has applicability in a wide range of instructional environments, including pilot training and driving simulators.
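
The abstract names the system's two technical pieces: registering the wearer's gaze to the scene, and projecting a cue elsewhere in it. As a purely illustrative sketch (not the authors' implementation), the Python/OpenCV loop below assumes a pre-captured reference image of the controlled environment with hand-labelled object regions and a known mapping from reference pixels to projector pixels (taken as the identity here). Each scene-camera frame is registered to the reference with SIFT feature matching (SIFT appears among the paper's author tags) and a RANSAC homography; the tracker's gaze point is mapped into reference coordinates to identify the attended object, and a circular cue is drawn at a target elsewhere. All function names, thresholds, and the cue style are assumptions.

```python
# Hypothetical sketch of the guidance loop; not the authors' code.
# Assumes: ref_gray - grayscale reference image of the environment,
#          objects  - hand-labelled {name: (x0, y0, x1, y1)} regions in it,
#          a known reference-to-projector pixel mapping (identity here).
import cv2
import numpy as np

sift = cv2.SIFT_create()
matcher = cv2.BFMatcher()

def register_frame(frame_gray, ref_gray):
    """Estimate a homography from the scene-camera frame to the reference image."""
    kp_f, des_f = sift.detectAndCompute(frame_gray, None)
    kp_r, des_r = sift.detectAndCompute(ref_gray, None)
    pairs = matcher.knnMatch(des_f, des_r, k=2)
    # Lowe's ratio test keeps only distinctive matches.
    good = [p[0] for p in pairs
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    if len(good) < 8:
        return None  # too few matches to trust a homography
    src = np.float32([kp_f[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def gaze_to_reference(gaze_xy, H):
    """Map a gaze point from scene-camera pixels to reference pixels."""
    pt = np.float32([[gaze_xy]])  # shape (1, 1, 2) for perspectiveTransform
    return cv2.perspectiveTransform(pt, H)[0, 0]

def attended_object(gaze_ref, objects):
    """Return the labelled region (if any) containing the gaze point."""
    x, y = gaze_ref
    for name, (x0, y0, x1, y1) in objects.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def draw_cue(projector_canvas, target_xy, radius=25):
    """Draw a bright circular cue at the target's projector coordinates."""
    center = tuple(int(v) for v in target_xy)
    cv2.circle(projector_canvas, center, radius, (255, 255, 255), 3)
    return projector_canvas
```

A real system would presumably make the cue gaze-contingent, removing it as soon as the viewer saccades toward the target (in the spirit of the subtle gaze direction work this paper builds on), and would need a calibrated reference-to-projector mapping rather than the identity assumed above.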

Supplementary Material

ZIP File (p75-booth.zip)
Supplemental material




Reviews

Alyx Macfadyen

I found this paper somewhat disappointing. Although the work presented has a novel quality, it does not break much ground. The paper describes a quantifiable method to record and reuse an eye movement pattern, and presents a system that records a human gaze with the ultimate aim of directing the gaze of another human through a sequence of views.

The authors suggest that this could be used in environments such as pilot training, or in any system where experienced professionals make complex decisions based on acquired knowledge. Tracking and recording the gaze of an experienced pilot or surgeon in critical scenarios, for example, could improve training and reflex actions through gaze-controlled instructional sessions based on these patterns. However, the authors also suggest persuasive advertising as another potential commercial application. This is a valid purpose, but a disappointing one to me, because I find manipulative commercial advertising less than pleasant.

The study involved a control group with a paper list of objects and a group guided by visual cues. The results show that those in the guided group identified relevant objects more quickly than those in the control group, who were delayed by having to shift their attention back and forth between the printed sheet and the objects presented, among other distractions.

This work may be of interest in any area where persuasion or control of gaze has some application. Knowledge of eye-tracking technology would be necessary to harvest desirable gaze patterns.

Online Computing Reviews Service




Published In

SAP '13: Proceedings of the ACM Symposium on Applied Perception
August 2013
150 pages
ISBN:9781450322621
DOI:10.1145/2492494
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. SIFT
  2. eye-tracking
  3. gaze manipulation
  4. training

Qualifiers

  • Research-article


Conference

SAP '13: ACM Symposium on Applied Perception 2013
August 22 - 23, 2013
Dublin, Ireland

Acceptance Rates

SAP '13 paper acceptance rate: 22 of 54 submissions, 41%
Overall acceptance rate: 43 of 94 submissions, 46%



Article Metrics

  • Downloads (last 12 months): 40
  • Downloads (last 6 weeks): 5
Reflects downloads up to 30 Aug 2024


Cited By

  • (2024) Flicker Augmentations: Rapid Brightness Modulation for Real-World Visual Guidance using Augmented Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642085. Online publication date: 11-May-2024.
  • (2023) Illustrative Motion Smoothing for Attention Guidance in Dynamic Visualizations. Computer Graphics Forum 42(3), 361-372. DOI: 10.1111/cgf.14836. Online publication date: 27-Jun-2023.
  • (2022) Attentional Orienting in Front and Rear Spaces in a Virtual Reality Discrimination Task. Vision 6(1), 3. DOI: 10.3390/vision6010003. Online publication date: 6-Jan-2022.
  • (2022) Look over there! Investigating Saliency Modulation for Visual Guidance with Augmented Reality Glasses. Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology, 1-15. DOI: 10.1145/3526113.3545633. Online publication date: 29-Oct-2022.
  • (2022) Multimodal Augmented Reality and Subtle Guidance for Industrial Assembly – A Survey and Ideation Method. Virtual, Augmented and Mixed Reality: Applications in Education, Aviation and Industry, 329-349. DOI: 10.1007/978-3-031-06015-1_23. Online publication date: 26-Jun-2022.
  • (2021) Saliency-Aware Subtle Augmentation Improves Human Visual Search Performance in VR. Brain Sciences 11(3), 283. DOI: 10.3390/brainsci11030283. Online publication date: 25-Feb-2021.
  • (2021) To See or Not to See. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 5(1), 1-25. DOI: 10.1145/3448123. Online publication date: 30-Mar-2021.
  • (2021) Multi-modal Multi-scale Attention Guidance in Cyber-Physical Environments. Proceedings of the 26th International Conference on Intelligent User Interfaces, 356-365. DOI: 10.1145/3397481.3450678. Online publication date: 14-Apr-2021.
  • (2021) A privacy-preserving approach to streaming eye-tracking data. IEEE Transactions on Visualization and Computer Graphics 27(5), 2555-2565. DOI: 10.1109/TVCG.2021.3067787. Online publication date: May-2021.
  • (2021) Augmentation Impacts Strategy and Gaze Distribution in a Dual-task Interleaving Scenario. International Journal of Human–Computer Interaction, 1-12. DOI: 10.1080/10447318.2021.1948250. Online publication date: 30-Jul-2021.
