DOI: 10.1145/2659532.2659592

Eye tracking as a computer input and interaction method

Published: 27 June 2014

Abstract

Eye tracking applications can be considered from two points of view: in the former, the eye tracker is a passive sensor that monitors the eyes to determine what the user is watching; in the latter, the eye tracker has an active role that allows the user to control a computer. As a computer input device, an eye tracker typically replaces the mouse point-select operation with a look-select process to press buttons, select icons, follow links, and so on. While look-select operations are naturally suited to eye input, controlling an interface element is not, because the eyes move by saccades -- quick movements of the point of gaze from one location to another. Since the main task of the eyes is simply to see, if they are also used for interacting with the computer it may be difficult to decide, for example, whether a button is being watched to understand its function or to trigger the associated action. In general, eye tracking systems present significant challenges when used for computer input, and much research has been carried out in this field.
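To make the ambiguity described above concrete, a common mitigation is dwell-time selection: a target is activated only after the gaze has rested on it for a fixed interval. The following is a minimal, hypothetical sketch in Python (not taken from the paper); the gaze-sample format and the hit_test and activate callbacks are assumptions introduced only for illustration.

    # Hypothetical dwell-time selection loop.
    # gaze_samples: iterable of (timestamp_seconds, x, y) tuples from an eye tracker.
    # hit_test(x, y) -> interface element under the gaze point, or None.
    # activate(target) -> performs the action associated with that element.
    def dwell_select(gaze_samples, hit_test, activate, dwell_time=0.8):
        current_target = None
        dwell_start = None
        for t, x, y in gaze_samples:
            target = hit_test(x, y)
            if target != current_target:
                # Gaze moved to a different element: restart the dwell timer.
                current_target, dwell_start = target, t
            elif target is not None and t - dwell_start >= dwell_time:
                # Gaze has rested on the same element long enough: treat it as a selection.
                activate(target)
                # Reset the timer so the target is not re-triggered on the next sample.
                dwell_start = t

The dwell threshold is a trade-off: if it is too short, elements fire merely by being looked at (the so-called Midas touch problem); if it is too long, interaction becomes slow and tiring.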




Published In

CompSysTech '14: Proceedings of the 15th International Conference on Computer Systems and Technologies
June 2014
489 pages
ISBN:9781450327534
DOI:10.1145/2659532
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Sponsors

  • UORB: University of Ruse, Bulgaria
  • Querbie: Querbie

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 27 June 2014


Author Tags

  1. eye tracking
  2. human computer interaction
  3. pointing devices
  4. visual attention

Qualifiers

  • Research-article

Conference

CompSysTech'14

Acceptance Rates

CompSysTech '14 Paper Acceptance Rate: 56 of 107 submissions, 52%
Overall Acceptance Rate: 241 of 492 submissions, 49%

Bibliometrics & Citations


Article Metrics

  • Downloads (last 12 months): 25
  • Downloads (last 6 weeks): 3
Reflects downloads up to 01 Nov 2024


Cited By

  • (2024) Digital Intelligences and Urban InfoSystems in Territorial Re-Education. Encyclopedia of Information Science and Technology, Sixth Edition, pp. 1-36. DOI: 10.4018/978-1-6684-7366-5.ch069. Online publication date: 1-Jul-2024.
  • (2023) Brain-Metaverse Interaction for Anxiety Regulation. 2023 9th International Conference on Virtual Reality (ICVR), pp. 385-392. DOI: 10.1109/ICVR57957.2023.10169785. Online publication date: 12-May-2023.
  • (2022) Eye Movement in User Experience and Human–Computer Interaction Research. Eye Tracking, pp. 165-183. DOI: 10.1007/978-1-0716-2391-6_10. Online publication date: 21-Jun-2022.
  • (2019) Identification of fixations, saccades and smooth pursuits based on segmentation and clustering. Intelligent Data Analysis, 23(5), pp. 1041-1054. DOI: 10.3233/IDA-184184. Online publication date: 24-Oct-2019.
  • (2019) Recognition of Eye Movements Based on EEG Signals and the SAX Algorithm. Intelligent and Interactive Computing, pp. 237-247. DOI: 10.1007/978-981-13-6031-2_38. Online publication date: 17-May-2019.
  • (2018) Eye Movement Analysis in Biometrics. Biometrics under Biomedical Considerations, pp. 171-183. DOI: 10.1007/978-981-13-1144-4_8. Online publication date: 14-Dec-2018.
  • (2016) Eye movement analysis for human authentication. Pattern Recognition Letters, 84(C), pp. 272-283. DOI: 10.1016/j.patrec.2016.11.002. Online publication date: 1-Dec-2016.
  • (2016) Interactive, Tangible and Multi-sensory Technology for a Cultural Heritage Exhibition: The Battle of Pavia. Innovative Approaches and Solutions in Advanced Intelligent Systems, pp. 77-94. DOI: 10.1007/978-3-319-32207-0_6. Online publication date: 30-Apr-2016.
  • (2014) Image Segmentation Scrutiny by Eye Tracking. Proceedings of the 2014 Tenth International Conference on Signal-Image Technology and Internet-Based Systems, pp. 554-559. DOI: 10.1109/SITIS.2014.19. Online publication date: 23-Nov-2014.
