
40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies

Published: 28 May 2024

Abstract

Gaze interaction enables users to communicate through eye tracking and is often the only channel of effective and efficient communication for individuals with severe motor disabilities. While there has been significant research and development of eye typing systems in the context of augmentative and alternative communication (AAC), there is no comprehensive review that integrates the key findings from the variety of aspects that constitute the complex landscape of gaze communication. This paper presents a detailed review and characterization of the literature and aims to consolidate the disparate efforts to provide eye typing solutions for AAC users. We provide a systematic understanding of the components and functionalities that underpin eye typing solutions, and we analyze the interplay of the different facets and their role in shaping the user experience, accessibility, performance, and overall effectiveness of eye typing technology. We also identify the major challenges and highlight several areas that require further research attention.
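To make concrete what the surveyed systems do at their core, the sketch below illustrates dwell-time selection, the selection trigger most commonly studied in the eye typing literature: the user fixates a key on an on-screen keyboard, and the key is entered once the fixation exceeds a dwell threshold. This is a minimal illustration under assumed names (GazeSample, key_at, dwell_select, and the toy layout are all hypothetical), not code from the paper or from any eye tracker's API.

```python
# Minimal, illustrative sketch of dwell-time key selection in eye typing.
# All names here are hypothetical, not from the paper or any tracker SDK.
from dataclasses import dataclass

DWELL_TIME = 0.6  # seconds gaze must rest on a key before it is "typed"

@dataclass
class GazeSample:
    x: float  # estimated gaze point, screen pixels
    y: float
    t: float  # timestamp, seconds

def key_at(x: float, y: float) -> str | None:
    """Map a gaze point to the on-screen key under it (toy 100-px grid)."""
    layout = {(0, 0): "a", (1, 0): "b"}
    return layout.get((int(x // 100), int(y // 100)))

def dwell_select(samples: list[GazeSample]) -> list[str]:
    """Emit a character whenever gaze rests on one key for DWELL_TIME."""
    typed: list[str] = []
    current: str | None = None
    since = 0.0
    for s in samples:
        k = key_at(s.x, s.y)
        if k != current:          # gaze moved to a different key: restart timer
            current, since = k, s.t
        elif k is not None and s.t - since >= DWELL_TIME:
            typed.append(k)       # dwell threshold reached: select the key
            since = s.t           # re-arm so a held gaze can repeat the key
    return typed

# 0.7 s of samples resting inside the "a" cell selects it exactly once.
stream = [GazeSample(50.0, 50.0, 0.01 * i) for i in range(70)]
print(dwell_select(stream))  # -> ['a']
```

The dwell threshold embodies the central speed-versus-error trade-off the review discusses: shorter dwell times raise entry rates but increase unintended selections (the "Midas touch" problem), which is why adjustable and dynamic dwell times recur throughout this literature.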

Published In

Proceedings of the ACM on Human-Computer Interaction, Volume 8, Issue ETRA
May 2024, 351 pages
EISSN: 2573-0142
DOI: 10.1145/3669943
This work is licensed under a Creative Commons Attribution 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 28 May 2024
Published in PACMHCI Volume 8, Issue ETRA

Author Tags

  1. aac
  2. eye typing
  3. gaze communication
  4. review

Qualifiers

  • Research-article
