Gazing and frowning as a new human--computer interaction technique

Published: 01 July 2004

Abstract

The present aim was to study a new technique for human--computer interaction. It combined two modalities, voluntary gaze direction and voluntary facial muscle activation, for object pointing and selection. Fourteen subjects performed a series of pointing tasks with the new technique and with a mouse. At short distances the mouse was significantly faster than the new technique, but there were no statistically significant differences between the techniques at medium and long distances. Fitts' law analyses were performed both using only error-free trials and using data that also included error trials (i.e., with the effective target width). In all cases both techniques seemed to follow Fitts' law, although in the effective-target-width analysis the correlation coefficient was smaller for the new technique (R = 0.776) than for the mouse (R = 0.991). The regression slopes suggested that at very long distances (i.e., beyond 800 pixels) the new technique might be faster than the mouse. The new technique showed promising results after only a short practice period, and in the future it could be especially useful for physically challenged persons.
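The Fitts' law analysis described above follows the standard Shannon formulation MT = a + b * log2(D/W + 1), and the error-inclusive variant replaces the nominal target width with an effective width derived from the spread of selection endpoints (We = 4.133 * SD, MacKenzie's adjustment for accuracy). A minimal sketch of that fitting procedure is below; the data values and the resulting intercept/slope are hypothetical illustrations, not the paper's measurements:

```python
import math
import numpy as np

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def effective_width(endpoint_sd):
    """Effective target width We = 4.133 * SD of selection endpoints
    (MacKenzie's adjustment, so error trials can be kept in the fit)."""
    return 4.133 * endpoint_sd

def fit_fitts(ids, movement_times):
    """Least-squares fit of MT = a + b * ID; returns (a, b, r)."""
    ids = np.asarray(ids, dtype=float)
    mt = np.asarray(movement_times, dtype=float)
    design = np.column_stack([np.ones_like(ids), ids])  # [1, ID] per trial
    (a, b), *_ = np.linalg.lstsq(design, mt, rcond=None)
    r = np.corrcoef(ids, mt)[0, 1]                      # Pearson correlation
    return a, b, r

# Hypothetical, perfectly linear data: intercept 300 ms, slope 150 ms/bit.
ids = [1.0, 2.0, 3.0, 4.0]
mts = [300 + 150 * i for i in ids]
a, b, r = fit_fitts(ids, mts)
```

A smaller correlation coefficient r, as reported for the new technique's effective-width model, indicates that movement times deviate more from the fitted line; the slope b is what determines which technique the model predicts to be faster at large distances (large ID).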


Cited By

  • (2024) Evaluating Target Expansion for Eye Pointing Tasks. Interacting with Computers 36, 4, 209--223. DOI: 10.1093/iwc/iwae004
  • (2023) Gaze & Tongue: A Subtle, Hands-Free Interaction for Head-Worn Devices. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1--4. DOI: 10.1145/3544549.3583930
  • (2023) Trigger motion and interface optimization of an eye-controlled human-computer interaction system based on voluntary eye blinks. Human--Computer Interaction 39, 5--6, 472--502. DOI: 10.1080/07370024.2023.2195850
  • (2022) Identification of Conditions with High Speed and Accuracy of Target Prediction Method by Switching from Ballistic Eye Movement to Homing Eye Movement. The Japanese Journal of Ergonomics 58, 4, 163--173. DOI: 10.5100/jje.58.163
  • (2022) Design and Evaluation of a Silent Speech-Based Selection Method for Eye-Gaze Pointing. Proceedings of the ACM on Human-Computer Interaction 6, ISS, 328--353. DOI: 10.1145/3567723
  • (2022) Kuiper Belt: Utilizing the "Out-of-natural Angle" Region in the Eye-gaze Interaction for Virtual Reality. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1--17. DOI: 10.1145/3491102.3517725
  • (2022) Bivariate Effective Width Method to Improve the Normalization Capability for Subjective Speed-accuracy Biases in Rectangular-target Pointing. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1--13. DOI: 10.1145/3491102.3517466
  • (2021) Evaluating the Effects of Saccade Types and Directions on Eye Pointing Tasks. In The 34th Annual ACM Symposium on User Interface Software and Technology, 1221--1234. DOI: 10.1145/3472749.3474818
  • (2021) An Adaptive Model of Gaze-based Selection. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1--11. DOI: 10.1145/3411764.3445177
  • (2021) Development of an Eye-Gaze Input System With High Speed and Accuracy through Target Prediction Based on Homing Eye Movements. IEEE Access 9, 22688--22697. DOI: 10.1109/ACCESS.2021.3055514

Published In

ACM Transactions on Applied Perception, Volume 1, Issue 1 (July 2004), 80 pages.
ISSN: 1544-3558; EISSN: 1544-3965. DOI: 10.1145/1008722
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. gaze direction
  2. electromyography
  3. facial muscle activity

