Research article
DOI: 10.1145/2578153.2578185

EyeTab: model-based gaze estimation on unmodified tablet computers

Published: 26 March 2014

Abstract

Despite the widespread use of mobile phones and tablets, hand-held portable devices have only recently been identified as a promising platform for gaze-aware applications. Estimating gaze on portable devices is challenging given their limited computational resources, low-quality integrated front-facing RGB cameras, and the small screens to which gaze is mapped. In this paper we present EyeTab, a model-based approach for binocular gaze estimation that runs entirely on an unmodified tablet. EyeTab builds on a set of established image processing and computer vision algorithms and adapts them for robust, near-real-time gaze estimation. A technical prototype evaluation with eight participants in a normal indoor office setting shows that EyeTab achieves an average gaze estimation accuracy of 6.88° of visual angle at 12 frames per second.
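The reported accuracy of 6.88° of visual angle only maps to an on-screen distance once a viewing distance is fixed. A minimal sketch of the standard trigonometric conversion (not from the paper; the ~300 mm tablet viewing distance is an assumption chosen for illustration):

```python
import math

def angular_error_deg(err_mm: float, viewing_dist_mm: float) -> float:
    """Convert an on-screen gaze error (mm) to degrees of visual angle
    at the given viewing distance (mm)."""
    return math.degrees(math.atan2(err_mm, viewing_dist_mm))

# Conversely, a 6.88 deg angular error at an assumed ~300 mm viewing
# distance corresponds to an on-screen offset of roughly:
offset_mm = 300 * math.tan(math.radians(6.88))
print(round(offset_mm, 1))  # ≈ 36.2 mm
```

On a tablet screen this offset spans several centimetres, which is why such accuracy figures are usually read as coarse region-of-screen estimates rather than point estimates.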




Published In

ETRA '14: Proceedings of the Symposium on Eye Tracking Research and Applications
March 2014, 394 pages
ISBN: 9781450327510
DOI: 10.1145/2578153

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. attentive user interfaces
    2. eye tracking
    3. gaze estimation
    4. gaze-based interfaces
    5. portable devices


Conference

ETRA '14: Eye Tracking Research and Applications
March 26-28, 2014, Safety Harbor, Florida

Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)


Cited By

    • (2025) Frequency-spatial interaction network for gaze estimation. Displays 86, 102878. DOI: 10.1016/j.displa.2024.102878
    • (2024) CalibRead: Unobtrusive Eye Tracking Calibration from Natural Reading Behavior. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(4), 1-30. DOI: 10.1145/3699737
    • (2024) UnitEye: Introducing a User-Friendly Plugin to Democratize Eye Tracking Technology in Unity Environments. Proceedings of Mensch und Computer 2024, 1-10. DOI: 10.1145/3670653.3670655
    • (2024) A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-7. DOI: 10.1145/3649902.3656363
    • (2024) Uncertainty Modeling for Gaze Estimation. IEEE Transactions on Image Processing 33, 2851-2866. DOI: 10.1109/TIP.2024.3364539
    • (2024) Robust Gaze Point Estimation for Metaverse With Common Mode Features Suppression Network. IEEE Transactions on Consumer Electronics 70(1), 2090-2098. DOI: 10.1109/TCE.2024.3351190
    • (2024) Multistream Gaze Estimation With Anatomical Eye Region Isolation by Synthetic to Real Transfer Learning. IEEE Transactions on Artificial Intelligence 5(8), 4232-4246. DOI: 10.1109/TAI.2024.3366174
    • (2024) Multi-Eyes: A Framework for Multi-User Eye-Tracking using Webcameras. 2024 IEEE International Conference on Information Reuse and Integration for Data Science (IRI), 308-313. DOI: 10.1109/IRI62200.2024.00069
    • (2024) Merging Multiple Datasets for Improved Appearance-Based Gaze Estimation. Pattern Recognition, 77-90. DOI: 10.1007/978-3-031-78341-8_6
    • (2023) FreeGaze: Resource-efficient Gaze Estimation via Frequency-domain Contrastive Learning. Proceedings of the 2023 International Conference on Embedded Wireless Systems and Networks, 60-71. DOI: 10.5555/3639940.3639949
