DOI: 10.1145/2702613.2725458

Filteryedping: A Dwell-Free Eye Typing Technique

Published: 18 April 2015

Abstract

The ability to type using eye gaze only is extremely important for individuals with a severe motor disability. To eye type, the user currently must gaze at letters on a virtual keyboard in sequence and dwell on each desired letter for a specific amount of time to input that key. Dwell-based eye typing has two possible drawbacks: unwanted input if the dwell threshold is too short, or slow typing rates if it is too long. We demonstrate an eye typing technique that does not require the user to dwell on the letters she wants to input. Our method automatically filters out unwanted letters from the sequence of letters gazed at while typing a word. It ranks candidate words based on their length and frequency and presents them to the user for confirmation. Spell correction and support for typing words not in the corpus are also included.
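The filter-and-rank step the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the subsequence test, the toy lexicon, and the exact ranking key (word length, then corpus frequency) are assumptions based only on the abstract's wording.

```python
def is_subsequence(word, gazed):
    """True if the letters of `word` appear, in order, within the gazed
    letter sequence (extra gazed letters in between are filtered out)."""
    it = iter(gazed)
    return all(ch in it for ch in word)  # `in` advances the iterator

def rank_candidates(gazed, lexicon):
    """Return lexicon words compatible with the gaze sequence, ranked
    longest-first, then most-frequent-first (assumed ranking key)."""
    matches = [w for w in lexicon if is_subsequence(w, gazed)]
    return sorted(matches, key=lambda w: (-len(w), -lexicon[w]))

# Hypothetical word-frequency lexicon for illustration only.
lexicon = {"hello": 100, "hell": 50, "hole": 30, "he": 500}

# The gaze passes over stray letters (x, q, z, p) between intended ones.
print(rank_candidates("hxeqlzlpo", lexicon))  # → ['hello', 'hell', 'he']
```

In a real system the ranked list would be shown to the user for confirmation, with spell correction and an out-of-lexicon fallback layered on top, as the abstract notes.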

Supplementary Material

suppl.mov (int0165-file3.mp4)
Supplemental video



Published In

CHI EA '15: Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems
April 2015
2546 pages
ISBN:9781450331463
DOI:10.1145/2702613
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. dwell-free
  2. eye typing
  3. gaze
  4. motor disability
  5. text entry

Qualifiers

  • Extended-abstract

Conference

CHI '15: CHI Conference on Human Factors in Computing Systems
April 18 - 23, 2015
Seoul, Republic of Korea

Acceptance Rates

CHI EA '15 Paper Acceptance Rate 379 of 1,520 submissions, 25%;
Overall Acceptance Rate 6,164 of 23,696 submissions, 26%

Article Metrics

  • Downloads (Last 12 months)12
  • Downloads (Last 6 weeks)0
Reflects downloads up to 14 Oct 2024

Cited By

  • (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1-19. DOI: 10.1145/3655596. Online publication date: 28-May-2024.
  • (2023) Understanding Adoption Barriers to Dwell-Free Eye-Typing: Design Implications from a Qualitative Deployment Study and Computational Simulations. Proceedings of the 28th International Conference on Intelligent User Interfaces, 607-620. DOI: 10.1145/3581641.3584093. Online publication date: 27-Mar-2023.
  • (2023) Gaze Speedup: Eye Gaze Assisted Gesture Typing in Virtual Reality. Proceedings of the 28th International Conference on Intelligent User Interfaces, 595-606. DOI: 10.1145/3581641.3584072. Online publication date: 27-Mar-2023.
  • (2022) “Sometimes I feel that I’m being left behind”: Exploring Computing Device Use by People with Upper Extremity Impairment During the COVID-19 Pandemic. Extended Abstracts of the 2022 CHI Conference on Human Factors in Computing Systems, 1-9. DOI: 10.1145/3491101.3519647. Online publication date: 27-Apr-2022.
  • (2021) Hummer: Text Entry by Gaze and Hum. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3411764.3445501. Online publication date: 6-May-2021.
  • (2021) “I...Got my Nose-Print. But it Wasn’t Accurate”: How People with Upper Extremity Impairment Authenticate on their Personal Computing Devices. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1-14. DOI: 10.1145/3411764.3445070. Online publication date: 6-May-2021.
  • (2020) TAGSwipe: Touch Assisted Gaze Swipe for Text Entry. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1-12. DOI: 10.1145/3313831.3376317. Online publication date: 21-Apr-2020.
  • (2019) CamType. Machine Vision and Applications 30(3), 407-421. DOI: 10.1007/s00138-018-00997-4. Online publication date: 1-Apr-2019.
  • (2018) Dwell time reduction technique using Fitts' law for gaze-based target acquisition. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, 1-7. DOI: 10.1145/3204493.3204532. Online publication date: 14-Jun-2018.
  • (2017) Exploring the Design Space of AAC Awareness Displays. Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, 2890-2903. DOI: 10.1145/3025453.3025610. Online publication date: 2-May-2017.
