Research article
DOI: 10.1145/1344471.1344477

Eye-S: a full-screen input modality for pure eye-based communication

Published: 26 March 2008

Abstract

To date, several eye input methods have been developed; however, they are usually designed for specific purposes (e.g., typing) and require dedicated graphical interfaces. In this paper we present Eye-S, a system that allows general input to be provided to the computer through a purely eye-based approach. Thanks to the adopted "eye graffiti" communication style, the technique can be used both for writing and for generating other kinds of commands. In Eye-S, letters and general eye gestures are created through sequences of fixations on nine areas of the screen, which we call hotspots. Since these sensitive regions are usually not visible, they do not interfere with other applications, which can therefore exploit all the available display space.
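The hotspot mechanism described above can be sketched in a few lines: divide the screen into a 3x3 grid, map each fixation to a grid cell, and match the resulting cell sequence against a gesture alphabet. This is a minimal illustrative sketch, not the authors' implementation; the function names, screen dimensions, and the two placeholder gesture sequences are all assumptions (the actual Eye-S sequences are defined in the paper).

```python
# Hypothetical sketch of an Eye-S-style hotspot recognizer.
# All names and the gesture table are illustrative assumptions.

from typing import List, Optional, Tuple

GRID = 3  # Eye-S uses nine hotspots (a 3x3 grid)

def hotspot(x: float, y: float, width: float, height: float) -> int:
    """Map a gaze coordinate to a hotspot index 0..8 (row-major)."""
    col = min(int(x / width * GRID), GRID - 1)
    row = min(int(y / height * GRID), GRID - 1)
    return row * GRID + col

# Placeholder gesture alphabet: hotspot sequences -> symbols.
GESTURES = {
    (0, 4, 8): "L",  # top-left -> centre -> bottom-right
    (2, 4, 6): "/",  # top-right -> centre -> bottom-left
}

def recognize(fixations: List[Tuple[float, float]],
              width: float = 1280, height: float = 1024) -> Optional[str]:
    """Collapse a stream of fixation points into a hotspot sequence
    and look it up in the gesture alphabet."""
    seq: List[int] = []
    for x, y in fixations:
        h = hotspot(x, y, width, height)
        if not seq or seq[-1] != h:  # merge consecutive fixations on one hotspot
            seq.append(h)
    return GESTURES.get(tuple(seq))
```

Because the grid cells need not be drawn, such a recognizer can run transparently on top of any application, which is the key point of the full-screen approach.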



      Published In

      ETRA '08: Proceedings of the 2008 symposium on Eye tracking research & applications
      March 2008
      285 pages
      ISBN:9781595939821
      DOI:10.1145/1344471

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. alternative communication
      2. assistive technology
      3. eye gesture
      4. eye sequence
      5. eye typing
      6. eye writing
      7. gaze interaction


      Conference

ETRA '08: Eye Tracking Research and Applications
March 26--28, 2008
Savannah, Georgia, USA

      Acceptance Rates

      Overall Acceptance Rate 69 of 137 submissions, 50%

      Cited By

• (2024) EyeGesener: Eye Gesture Listener for Smart Glasses Interaction Using Acoustic Sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 8(3), 1--28. DOI: 10.1145/3678541. 9 Sep 2024.
• (2024) 40 Years of Eye Typing: Challenges, Gaps, and Emergent Strategies. Proceedings of the ACM on Human-Computer Interaction 8(ETRA), 1--19. DOI: 10.1145/3655596. 28 May 2024.
• (2024) Translated Pattern-Based Eye-Writing Recognition Using Dilated Causal Convolution Network. IEEE Access 12, 59079--59092. DOI: 10.1109/ACCESS.2024.3390746. 2024.
• (2024) Evaluation of several gaze control methods for a board game with no time pressure. Procedia Computer Science 225(C), 2457--2466. DOI: 10.1016/j.procs.2023.10.237. 4 Mar 2024.
• (2024) Leyenes. International Journal of Human-Computer Studies 184(C). DOI: 10.1016/j.ijhcs.2023.103204. 1 Apr 2024.
• (2023) EEG Based Communication System by Using Artificial Neural Networks. 2023 Medical Technologies Congress (TIPTEKNO), 1--4. DOI: 10.1109/TIPTEKNO59875.2023.10359177. 10 Nov 2023.
• (2023) Development of a real-time eye movement-based computer interface for communication with improved accuracy for disabled people under natural head movements. Journal of Real-Time Image Processing 20(4). DOI: 10.1007/s11554-023-01336-1. 6 Jul 2023.
• (2022) Enhancing User Experience of Eye-Controlled Systems: Design Recommendations on the Optimal Size, Distance and Shape of Interactive Components from the Perspective of Peripheral Vision. International Journal of Environmental Research and Public Health 19(17), 10737. DOI: 10.3390/ijerph191710737. 29 Aug 2022.
• (2022) Usability of the super-vowel for gaze-based text entry. 2022 Symposium on Eye Tracking Research and Applications, 1--5. DOI: 10.1145/3517031.3529231. 8 Jun 2022.
• (2022) Lattice Menu: A Low-Error Gaze-Based Marking Menu Utilizing Target-Assisted Gaze Gestures on a Lattice of Visual Anchors. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1--12. DOI: 10.1145/3491102.3501977. 29 Apr 2022.
