DOI: 10.1145/3126594.3126630

EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays

Published: 20 October 2017

Abstract

While gaze holds a lot of promise for hands-free interaction with public displays, remote eye trackers with their confined tracking box restrict users to a single stationary position in front of the display. We present EyeScout, an active eye tracking system that combines an eye tracker mounted on a rail system with a computational method to automatically detect and align the tracker with the user's lateral movement. EyeScout addresses key limitations of current gaze-enabled large public displays by offering two novel gaze-interaction modes for a single user: In "Walk then Interact" the user can walk up to an arbitrary position in front of the display and interact, while in "Walk and Interact" the user can interact even while on the move. We report on a user study that shows that EyeScout is well perceived by users, extends a public display's sweet spot into a sweet line, and reduces gaze interaction kick-off time to 3.5 seconds -- a 62% improvement over state of the art solutions. We discuss sample applications that demonstrate how EyeScout can enable position and movement-independent gaze interaction with large public displays.
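The abstract describes a computational method that keeps the rail-mounted eye tracker aligned with the user's lateral position. The paper's actual method is not reproduced here; as a hedged illustration only, with all names, gains, and limits invented for this sketch, one control step of such an alignment loop could look like a rate-limited proportional follower:

```python
def follow_user(carriage_x, user_x, gain=0.5, max_step=0.2):
    """One control tick: move the rail carriage toward the user's
    lateral position, capped by the motor's maximum travel per tick.
    All parameters are illustrative assumptions, not from the paper."""
    error = user_x - carriage_x
    # Proportional response, clamped to the per-tick travel limit.
    step = max(-max_step, min(max_step, gain * error))
    return carriage_x + step

# Simulate a user walking along the display while the carriage catches up.
carriage = 0.0
for user in [1.0, 1.2, 1.4, 1.6]:
    carriage = follow_user(carriage, user)
```

In this toy loop the carriage closes in on a moving target at up to `max_step` units per tick, which mirrors the idea of extending a stationary "sweet spot" into a "sweet line"; the real system additionally has to detect the user's body position and keep the eye tracker's confined tracking box over the user's face.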

Supplementary Material

• ZIP File: uistf3285-file4.zip
• Supplemental video: suppl.mov (uistf3285-file3.mp4)
• MP4 File: p155-khamis.mp4



    Published In

    UIST '17: Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology
    October 2017
    870 pages
    ISBN:9781450349819
    DOI:10.1145/3126594
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. body tracking
    2. gaze estimation
    3. gaze-enabled displays

    Qualifiers

    • Research-article

    Conference

    UIST '17

    Acceptance Rates

    UIST '17 Paper Acceptance Rate 73 of 324 submissions, 23%;
    Overall Acceptance Rate 561 of 2,567 submissions, 22%

    Article Metrics

    • Downloads (Last 12 months)47
    • Downloads (Last 6 weeks)2
    Reflects downloads up to 26 Jan 2025

    Cited By

    • (2024) Navigating the Digital Public Sphere: An AI-Driven Analysis of Interaction Dynamics across Societal Domains. Societies 14(10), 195. DOI: 10.3390/soc14100195. Online: 26 Sep 2024.
    • (2024) Exploring Pointer Enhancement Techniques for Target Selection on Large Curved Display. Proceedings of the ACM on Human-Computer Interaction 8(ISS), 214–235. DOI: 10.1145/3698135. Online: 24 Oct 2024.
    • (2024) Effect of label elements in bottled water: impact on consumer preferences, purchase intentions and health perception through affective sensory tests. Heliyon, e35106. DOI: 10.1016/j.heliyon.2024.e35106. Online: Jul 2024.
    • (2023) GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–8. DOI: 10.1145/3588015.3589663. Online: 30 May 2023.
    • (2023) Gaze-based Interaction on Handheld Mobile Devices. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1–4. DOI: 10.1145/3588015.3589540. Online: 30 May 2023.
    • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. DOI: 10.1145/3544548.3580871. Online: 19 Apr 2023.
    • (2023) User-Centered Evaluation of Different Configurations of a Touchless Gestural Interface for Interactive Displays. Human-Computer Interaction – INTERACT 2023, 501–520. DOI: 10.1007/978-3-031-42280-5_32. Online: 25 Aug 2023.
    • (2022) Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf Smartphones. Multimodal Technologies and Interaction 6(10), 94. DOI: 10.3390/mti6100094. Online: 19 Oct 2022.
    • (2022) Model-based Gaze Estimation with Transparent Markers on Large Screens. Proceedings of the ACM on Human-Computer Interaction 6(ETRA), 1–16. DOI: 10.1145/3530888. Online: 13 May 2022.
    • (2022) Tracker/Camera Calibration for Accurate Automatic Gaze Annotation of Images and Videos. 2022 Symposium on Eye Tracking Research and Applications, 1–6. DOI: 10.1145/3517031.3529643. Online: 8 Jun 2022.
    • (additional citing works truncated on the source page)
