DOI: 10.1145/2858036.2858517

AugKey: Increasing Foveal Throughput in Eye Typing with Augmented Keys

Published: 07 May 2016

Abstract

Eye typing is an important tool for people with physical disabilities and, for some, their main form of communication. Observing expert typists on physical keyboards, we note that visual throughput is considerably reduced in current eye-typing solutions. We propose AugKey, which improves throughput by augmenting keys with a prefix, allowing continuous inspection of the typed text, and with suffixes that speed up typing through word prediction. AugKey limits this visual information to the foveal region to minimize eye movements (i.e., to reduce eye work). We applied AugKey to a dwell-time keyboard and compared its performance against two conditions without augmented feedback: a keyboard with word prediction and one without. Results show that AugKey can be about 28% faster than no word prediction and 20% faster than traditional word prediction, with a lower workload index.
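The mechanism described above, composing each key's label from a prefix (the tail of the text typed so far, so the typist can proofread without leaving the key) and suffixes (word-prediction completions), can be sketched in a few lines. The Python below is a minimal illustration under our own assumptions: the paper publishes no implementation, and every name here, including the toy dictionary-based predictor, is hypothetical.

```python
# Hypothetical sketch of the AugKey idea: each key shows (a) a prefix,
# the tail of the typed text, so the typist can inspect their text
# without looking away from the key, and (b) suffixes, predicted
# completions of the current word if this key is typed next. All names
# and the prediction back-end are illustrative assumptions, not the
# paper's code.

from dataclasses import dataclass

@dataclass
class AugmentedKey:
    letter: str        # the key's own character
    prefix: str        # tail of the typed text, drawn beside the letter
    suffixes: list     # predicted completions shown with the key

def augment_keys(typed_text, keys, predict, prefix_len=8, n_suffixes=3):
    """Build the augmented label for every key on the keyboard.

    predict(stem) is assumed to return candidate words for a word stem;
    a real system would use a language model or frequency dictionary.
    """
    stem = typed_text.rsplit(" ", 1)[-1]   # current (partial) word
    prefix = typed_text[-prefix_len:]      # foveal text-inspection window
    augmented = []
    for k in keys:
        candidates = predict(stem + k)     # completions if k is typed next
        suffixes = [w[len(stem) + 1:] for w in candidates[:n_suffixes]]
        augmented.append(AugmentedKey(k, prefix, suffixes))
    return augmented

# Toy usage with a trivial dictionary-based predictor:
WORDS = ["the", "there", "their", "then", "this", "that"]
predict = lambda s: [w for w in WORDS if w.startswith(s)]
for key in augment_keys("typing th", "aeio", predict):
    print(key.letter, "| prefix:", key.prefix, "| suffixes:", key.suffixes)
```

In a real interface the prefix and suffixes would be rendered immediately around the fixated key, sized so that label, prefix, and suffixes all fall within the foveal region, which is what would let the typist verify text and pick predictions without extra saccades.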

Supplementary Material

Supplemental video: suppl.mov (pn2399.mp4)



Published In

CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
May 2016, 6108 pages
ISBN: 9781450333627
DOI: 10.1145/2858036

Publisher

Association for Computing Machinery, New York, NY, United States

    Author Tags

    1. augmented feedback
    2. eye typing
    3. text-entry speed
    4. user experience
    5. word prediction

    Qualifiers

    • Research-article

    Conference

CHI '16: CHI Conference on Human Factors in Computing Systems
May 7-12, 2016
San Jose, California, USA

    Acceptance Rates

CHI '16 Paper Acceptance Rate: 565 of 2,435 submissions, 23%
Overall Acceptance Rate: 6,199 of 26,314 submissions, 24%


