Improving User Experience of Eye Tracking-Based Interaction: Introspecting and Adapting Interfaces

Published: 02 November 2019

Abstract

Eye tracking systems have greatly improved in recent years, making them a viable and affordable digital communication channel, especially for people lacking fine motor skills. Using eye tracking as an input method is challenging due to accuracy and ambiguity issues, so research in eye gaze interaction has focused mainly on better pointing and typing methods. However, these methods eventually need to be integrated so that users can control application interfaces. A common approach to employing eye tracking for controlling application interfaces is to emulate mouse and keyboard functionality. We argue that this emulation approach incurs unnecessary interaction and visual overhead for users, degrading the entire experience of gaze-based computer access. We discuss how knowledge about interface semantics can help reduce this interaction and visual overhead and improve the user experience. We therefore propose the efficient introspection of interfaces to retrieve their semantics and adapt the interaction for eye gaze. We have developed a Web browser, GazeTheWeb, that introspects Web page interfaces and adapts both the browser interface and the interaction elements on Web pages for gaze input. In a summative lab study with 20 participants, GazeTheWeb allowed participants to accomplish information search and browsing tasks significantly faster than an emulation approach. Additional feasibility tests of GazeTheWeb in lab and home environments showcase its effectiveness in supporting daily Web browsing activities and adapting a large variety of modern Web pages for interaction by people with motor impairments.
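
As a rough illustration of the introspection idea described above, the following TypeScript sketch (an illustration only, not GazeTheWeb's actual implementation; all names and selectors are assumptions) enumerates the element classes a gaze-adapted browser needs to know about: clickable targets for dwell-based selection, text inputs that can be routed to a gaze-optimized keyboard, and scrollable regions for gaze-driven scrolling.

type GazeTargetKind = "select" | "text-input" | "scroll";

interface GazeTarget {
  kind: GazeTargetKind;
  rect: DOMRect;     // screen geometry used to match fixations to targets
  element: Element;
}

// Introspect a page and collect the semantic targets that gaze input can
// act on directly, instead of emulating raw mouse and keyboard events.
// (Hypothetical sketch; GazeTheWeb's real introspection is not shown here.)
function introspectPage(doc: Document): GazeTarget[] {
  const targets: GazeTarget[] = [];

  // Clickable elements become dwell-selectable targets.
  for (const el of Array.from(doc.querySelectorAll(
      "a[href], button, [onclick], [role='button']"))) {
    targets.push({ kind: "select", rect: el.getBoundingClientRect(), element: el });
  }

  // Text fields can open a gaze-optimized on-screen keyboard.
  for (const el of Array.from(doc.querySelectorAll(
      "input[type='text'], input[type='search'], textarea"))) {
    targets.push({ kind: "text-input", rect: el.getBoundingClientRect(), element: el });
  }

  // Overflowing containers can be scrolled by fixating near their edges,
  // rather than by emulated mouse-wheel events.
  for (const el of Array.from(doc.querySelectorAll("*"))) {
    const style = doc.defaultView!.getComputedStyle(el);
    if (/(auto|scroll)/.test(style.overflowY) && el.scrollHeight > el.clientHeight) {
      targets.push({ kind: "scroll", rect: el.getBoundingClientRect(), element: el });
    }
  }

  // Hidden elements have zero area; a fixation can never land on them.
  return targets.filter(t => t.rect.width > 0 && t.rect.height > 0);
}

A gaze-adapted browser would keep such a list synchronized with the live page (for example, via a MutationObserver) and overlay enlarged, dwell-activated controls on these targets, so that users act on interface semantics rather than steering an emulated cursor.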

Published In

ACM Transactions on Computer-Human Interaction  Volume 26, Issue 6
December 2019
230 pages
ISSN:1073-0516
EISSN:1557-7325
DOI:10.1145/3371148
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 02 November 2019
Accepted: 01 May 2019
Revised: 01 May 2019
Received: 01 August 2018
Published in TOCHI Volume 26, Issue 6

Author Tags

  1. Eye tracking
  2. GazeTheWeb
  3. Web accessibility
  4. gaze interaction experience
  5. gaze-based emulation
  6. gaze-controlled interface
  7. interface semantics
  8. introspection

Qualifiers

  • Research-article
  • Research
  • Refereed

Cited By

  • (2025) Adaptive Real-Time Translation Assistance Through Eye-Tracking. AI 6:1 (5). DOI: 10.3390/ai6010005. Online publication date: 2-Jan-2025.
  • (2025) Selection of optimal display color for China’s emergency management system using eye tracking. Displays 87 (102888). DOI: 10.1016/j.displa.2024.102888. Online publication date: Apr-2025.
  • (2024) Designing User Experience Improvement and User Behavior Pattern Recognition Algorithms in Design Operation. Journal of Machine and Computing 4 (1009-1017). DOI: 10.53759/7669/jmc20240409. Online publication date: 5-Oct-2024.
  • (2024) Gaze-Data-Based Probability Inference for Menu Item Position Effect on Information Search. Journal of Advanced Computational Intelligence and Intelligent Informatics 28:2 (303-315). DOI: 10.20965/jaciii.2024.p0303. Online publication date: 20-Mar-2024.
  • (2024) Remapping the Document Object Model using Geometric and Hierarchical Data Structures for Efficient Eye Control. Proceedings of the ACM on Human-Computer Interaction 8:ETRA (1-16). DOI: 10.1145/3655608. Online publication date: 28-May-2024.
  • (2024) A Functional Usability Analysis of Appearance-Based Gaze Tracking for Accessibility. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (1-7). DOI: 10.1145/3649902.3656363. Online publication date: 4-Jun-2024.
  • (2024) Eye-Hand Typing: Eye Gaze Assisted Finger Typing via Bayesian Processes in AR. IEEE Transactions on Visualization and Computer Graphics 30:5 (2496-2506). DOI: 10.1109/TVCG.2024.3372106. Online publication date: 19-Mar-2024.
  • (2023) FreeGaze: A Framework for 3D Gaze Estimation Using Appearance Cues from a Facial Video. Sensors 23:23 (9604). DOI: 10.3390/s23239604. Online publication date: 4-Dec-2023.
  • (2023) Gaze Speedup: Eye Gaze Assisted Gesture Typing in Virtual Reality. Proceedings of the 28th International Conference on Intelligent User Interfaces (595-606). DOI: 10.1145/3581641.3584072. Online publication date: 27-Mar-2023.
  • (2023) Improving and Analyzing Sketchy High-Fidelity Free-Eye Drawing. Proceedings of the 2023 ACM Designing Interactive Systems Conference (856-870). DOI: 10.1145/3563657.3596121. Online publication date: 10-Jul-2023.
