DOI: 10.1145/3314111.3319838

Calibration-free text entry using smooth pursuit eye movements

Published: 25 June 2019

Abstract

In this paper, we propose a calibration-free gaze-based text entry system that uses smooth pursuit eye movements. We report on our implementation, which improves over prior work on smooth pursuit text entry by 1) eliminating the need for calibration through motion correlation, 2) increasing the input rate from 3.34 to 3.41 words per minute, and 3) featuring text suggestions trained on 10,000 lexicon sentences recommended in the literature. We report on a user study (N=26) which shows that users are able to eye type at 3.41 words per minute without calibration and without user training. Qualitative feedback also indicates that users perceive the system positively. Our work is of particular benefit for disabled users and for situations in which voice and tactile input are not feasible (e.g., in noisy environments or when the hands are occupied).
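The key mechanism behind the calibration-free property is motion correlation: instead of mapping gaze to screen coordinates, the system correlates the raw gaze trajectory with the known trajectories of moving on-screen targets (in a text entry interface, the moving keys) and selects the target whose motion best matches the eye movement. As a rough illustration of this principle only, the following Python sketch (assuming NumPy; the function names, window handling, and 0.8 threshold are illustrative assumptions, not the authors' exact method) selects a target by Pearson correlation:

    import numpy as np

    def pearson(a: np.ndarray, b: np.ndarray) -> float:
        """Pearson correlation of two 1-D signals (0.0 if either is constant)."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom > 0 else 0.0

    def select_target(gaze_xy, target_trajs, threshold=0.8):
        """Pick the moving target whose trajectory best matches the gaze.

        gaze_xy:      (N, 2) array of raw, uncalibrated gaze samples.
        target_trajs: dict mapping a target label to its (N, 2) on-screen
                      positions over the same time window.

        Pearson correlation is invariant to offset and scale, so a linear
        miscalibration of the tracker does not change the winner -- this
        is what lets pursuit selection work without calibration.
        """
        best_label, best_score = None, threshold
        for label, traj in target_trajs.items():
            # Require both axes to match: take the weaker of the two
            # per-axis correlations as the target's score.
            score = min(pearson(gaze_xy[:, 0], traj[:, 0]),
                        pearson(gaze_xy[:, 1], traj[:, 1]))
            if score > best_score:
                best_label, best_score = label, score
        return best_label  # None if no target correlates well enough

Taking the minimum of the per-axis correlations is a conservative choice that requires both the horizontal and the vertical components to match; averaging the two axes is a common alternative in the pursuits literature.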



      Published In

      ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
      June 2019
      623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111


      Publisher

Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. calibration-free
      2. gaze interaction
      3. smooth pursuits
      4. text entry

      Qualifiers

      • Short-paper

      Conference

      ETRA '19

      Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)

