
Context switching eye typing using dynamic expanding targets

Published: 15 June 2018
DOI: 10.1145/3206343.3206347

Abstract

Text entry by gazing at a virtual keyboard (also known as eye typing) is an important component of any gaze communication system. One of the main challenges for efficient communication is avoiding unintended key selections caused by the Midas touch problem. The most common gaze selection technique is dwelling. Though easy to learn, dwelling involves a trade-off: long dwell times slow down communication, while short dwell times are prone to error. Context switching (CS) is a faster and more comfortable alternative, but the duplication of contexts takes up considerable screen space. In this paper we introduce two new CS designs that use dynamic expanding targets and are better suited to situations where a reduced interaction window is required. We compare the performance of the two new designs with the original CS design, which uses QWERTY layouts as contexts. Our results with 6 participants typing on the 3 keyboards show that the smaller layouts with dynamic expanding targets are as accurate and comfortable as the larger QWERTY layout, though they yield lower typing speeds.
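
The dwell-time trade-off described above can be made concrete with a small sketch. The following Python snippet is a minimal, illustrative dwell-time key selector, not the authors' implementation; the gaze-sample format, the 0.6 s threshold, and the name dwell_select are assumptions made for the example. A key is emitted once gaze has rested on it for at least the dwell time, so a longer threshold filters out stray glances (fewer Midas-touch errors) at the cost of slower typing.

    # Minimal sketch of dwell-time key selection (illustrative, not from the paper).
    # `samples` is assumed to be a time-ordered stream of (timestamp_seconds, key)
    # pairs, where `key` is the key under the gaze point or None when off-keyboard.
    DWELL_TIME = 0.6  # seconds; an assumed, illustrative threshold

    def dwell_select(samples, dwell_time=DWELL_TIME):
        """Yield keys selected by dwelling on them for at least `dwell_time`."""
        current_key, dwell_start, selected = None, None, False
        for t, key in samples:
            if key != current_key:
                # Gaze moved to a different key (or off the keyboard): restart the dwell.
                current_key, dwell_start, selected = key, t, False
            elif key is not None and not selected and t - dwell_start >= dwell_time:
                selected = True  # suppress repeats until gaze leaves this key
                yield key

    # Example: a long fixation on 'h', a brief glance at 'x', then a dwell on 'i'.
    gaze = [(0.00, 'h'), (0.35, 'h'), (0.65, 'h'),
            (0.80, 'x'), (0.95, 'x'),
            (1.10, 'i'), (1.50, 'i'), (1.80, 'i')]
    print(list(dwell_select(gaze)))  # -> ['h', 'i']

Shortening the dwell time below the duration of the glance at 'x' would select that key as well, which illustrates why short dwell times are error-prone.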


Published In

COGAIN '18: Proceedings of the Workshop on Communication by Gaze Interaction
June 2018
69 pages
ISBN:9781450357906
DOI:10.1145/3206343
General Chairs: Carlos Morimoto, Thies Pfeiffer
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. context switching
  2. expanding targets
  3. eye tracking
  4. eye typing
  5. gaze interaction

Qualifiers

  • Research-article

Conference

ETRA '18
