DOI: 10.1145/3020165.3020170
Research article (Public Access)

SearchGazer: Webcam Eye Tracking for Remote Studies of Web Search

Published: 07 March 2017

Abstract

We introduce SearchGazer, a web-based eye tracker for remote web search studies using common webcams already present in laptops and some desktop computers. SearchGazer is a pure JavaScript library that infers the gaze behavior of searchers in real time. The eye tracking model self-calibrates by watching searchers interact with the search pages and trains a mapping of eye features to gaze locations and search page elements on the screen. Contrary to typical eye tracking studies in information retrieval, this approach does not require the purchase of any additional specialized equipment, and can be done remotely in a user's natural environment, leading to cheaper and easier visual attention studies.
While SearchGazer is not intended to be as accurate as specialized eye trackers, it is able to replicate many of the research findings of three seminal information retrieval papers: two that used eye tracking devices, and one that used the mouse cursor as a restricted focus viewer. Charts and heatmaps from those original papers are plotted side-by-side with SearchGazer results. While the main results are similar, there are some notable differences, which we hypothesize derive from improvements in the latest ranking technologies used by current versions of search engines and diligence by remote users. As part of this paper, we also release SearchGazer as a library that can be integrated into any search page.
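The self-calibration approach the abstract describes — using searchers' own interactions as training data instead of an explicit calibration phase — can be sketched in plain JavaScript. This is an illustrative reconstruction, not SearchGazer's actual code: in webcam eye trackers of this family (including WebGazer, which SearchGazer extends), each click supplies an (eye-feature, screen-coordinate) training pair, and a regularized linear (ridge) regression maps eye features to gaze position. All function names below are hypothetical.

```javascript
// Sketch of interaction-based self-calibration: user clicks label the
// eye features observed at click time with known screen coordinates,
// and ridge regression learns the feature-to-gaze mapping.

// Multiply an m x n matrix A by an n x p matrix B.
function matMul(A, B) {
  const m = A.length, n = B.length, p = B[0].length;
  const C = Array.from({ length: m }, () => new Array(p).fill(0));
  for (let i = 0; i < m; i++)
    for (let k = 0; k < n; k++)
      for (let j = 0; j < p; j++) C[i][j] += A[i][k] * B[k][j];
  return C;
}

function transpose(A) {
  return A[0].map((_, j) => A.map(row => row[j]));
}

// Solve M x = b by Gauss-Jordan elimination with partial pivoting.
function solve(M, b) {
  const n = M.length;
  const A = M.map((row, i) => [...row, b[i]]);
  for (let col = 0; col < n; col++) {
    let piv = col;
    for (let r = col + 1; r < n; r++)
      if (Math.abs(A[r][col]) > Math.abs(A[piv][col])) piv = r;
    [A[col], A[piv]] = [A[piv], A[col]];
    for (let r = 0; r < n; r++) {
      if (r === col) continue;
      const f = A[r][col] / A[col][col];
      for (let c = col; c <= n; c++) A[r][c] -= f * A[col][c];
    }
  }
  return A.map((row, i) => row[n] / A[i][i]);
}

// Ridge regression fit: w = (X^T X + lambda I)^{-1} X^T y.
// The lambda term keeps the solution stable when clicks are few.
function ridgeFit(X, y, lambda = 1e-3) {
  const Xt = transpose(X);
  const G = matMul(Xt, X);
  for (let i = 0; i < G.length; i++) G[i][i] += lambda;
  const Xty = matMul(Xt, y.map(v => [v])).map(r => r[0]);
  return solve(G, Xty);
}

// Each click contributes one sample: the eye-feature vector observed at
// click time, labeled with the click's screen coordinates. Two separate
// models predict the x and y gaze coordinates.
function trainGazeModel(samples) {
  const X = samples.map(s => s.features);
  const wx = ridgeFit(X, samples.map(s => s.x));
  const wy = ridgeFit(X, samples.map(s => s.y));
  return feats => ({
    x: feats.reduce((acc, f, i) => acc + f * wx[i], 0),
    y: feats.reduce((acc, f, i) => acc + f * wy[i], 0),
  });
}
```

In a real deployment the feature vector would come from pupil-patch pixels extracted from the webcam stream, and the model would be refreshed continuously as the searcher clicks and moves the cursor, which is what makes the tracker self-calibrating.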




    Published In

CHIIR '17: Proceedings of the 2017 Conference on Human Information Interaction and Retrieval
    March 2017
    454 pages
    ISBN:9781450346771
    DOI:10.1145/3020165
• Conference Chairs: Ragnar Nordlie, Nils Pharo
• Program Chairs: Luanne Freund, Birger Larsen, Dan Russell

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. gaze prediction
    2. online eye tracking
    3. remote user studies
    4. user interactions
    5. web search behavior

    Acceptance Rates

CHIIR '17 paper acceptance rate: 10 of 48 submissions (21%)
Overall acceptance rate: 55 of 163 submissions (34%)

    Article Metrics

• Downloads (last 12 months): 491
• Downloads (last 6 weeks): 55
Reflects downloads up to 08 Feb 2025

    Cited By

• (2024) Predictive Gaze Analytics: A Comparative Case Study of the Foretelling Signs of User Performance during Interaction with Visualizations of Ontology Class Hierarchies. Multimodal Technologies and Interaction, 8(10):90. DOI: 10.3390/mti8100090
• (2024) How Scientists Use Webcams to Track Human Gaze. Frontiers for Young Minds, 12. DOI: 10.3389/frym.2024.1259404
• (2024) Evaluating the Effectiveness of AI-based Synchronous Online Feedback Education System. The Journal of Korean Association of Computer Education, 27(7):11-23. DOI: 10.32431/kace.2024.27.7.002
• (2024) Design of a Personalized AI-based Synchronous Online Education Feedback System using Eye Tracking Data and Evaluation of Intention to Use. The Journal of Korean Association of Computer Education, 27(1):25-37. DOI: 10.32431/kace.2024.27.1.002
• (2024) Analyzing Reading Patterns with Webcams: An Eye-Tracking Study of the Albanian Language. 2024 International Conference on Software, Telecommunications and Computer Networks (SoftCOM), 1-5. DOI: 10.23919/SoftCOM62040.2024.10721886
• (2024) (The Limits of) Eye-Tracking with iPads. Journal of Vision, 24(7):1. DOI: 10.1167/jov.24.7.1
• (2024) Learning to Rank for Maps at Airbnb. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 5061-5069. DOI: 10.1145/3637528.3671648
• (2024) GazeTrak: Exploring Acoustic-based Eye Tracking on a Glass Frame. Proceedings of the 30th Annual International Conference on Mobile Computing and Networking, 497-512. DOI: 10.1145/3636534.3649376
• (2024) Eye-tracking AD: Cutting-Edge Web Advertising on Smartphone Aligned with User's Gaze. 2024 IEEE International Conference on Pervasive Computing and Communications Workshops (PerCom Workshops), 469-474. DOI: 10.1109/PerComWorkshops59983.2024.10502602
• (2024) A Digital Camera-Based Eye Movement Assessment Method for NeuroEye Examination. IEEE Journal of Biomedical and Health Informatics, 28(2):655-665. DOI: 10.1109/JBHI.2023.3285940
