DOI: 10.1145/1344471.1344523

Snap clutch, a moded approach to solving the Midas touch problem

Published: 26 March 2008
Abstract

This paper proposes a simple approach to an old problem, that of the 'Midas Touch'. It uses modes to emulate different types of mouse behavior with gaze, with gestures to switch between these modes. A lightweight gesture is also used to switch gaze control off when it is not needed, thereby removing a major cause of the problem. The ideas have been trialed in Second Life, which is characterized by a feature-rich set of interaction techniques and a 3D graphical world. The use of gaze with this type of virtual community is of great relevance to severely disabled people, as it can enable them to take part in the community on a similar basis to able-bodied participants. The assumption here, though, is that this group will use gaze as a single modality and that dwell will be an important selection technique, so the 'Midas Touch' problem needs to be considered in the context of fast dwell-based interaction. The solution proposed here, Snap Clutch, is incorporated into mouse emulator software. The user trials reported here show this to be a very promising way of dealing with some of the interaction problems that users of these complex interfaces face when using gaze by dwell.
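To make the moded idea concrete, the sketch below shows one way such a clutch could be structured: a small state machine in which a deliberate gesture snaps between an 'off' mode and a dwell-click mode, and dwell-based selection can only fire while a mode is active. This is a minimal illustrative sketch, not the authors' implementation; the mode set, the gesture trigger, and the thresholds DWELL_TIME_S and DWELL_RADIUS_PX are assumptions chosen for the example.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Illustrative thresholds; the paper does not specify these values.
DWELL_TIME_S = 0.5     # how long gaze must rest on a spot to count as a dwell
DWELL_RADIUS_PX = 30   # how far gaze may wander while still "dwelling"

class Mode(Enum):
    GAZE_OFF = auto()    # clutched out: gaze moves nothing, clicks nothing
    LEFT_CLICK = auto()  # a completed dwell emulates a left click
    # further modes (right click, drag, ...) would follow the same pattern

@dataclass
class GazeSample:
    x: float  # screen coordinates in pixels
    y: float
    t: float  # timestamp in seconds

class SnapClutchSketch:
    """Mode switching by gesture plus dwell-based selection within a mode."""

    def __init__(self) -> None:
        self.mode = Mode.GAZE_OFF        # start clutched out: no Midas Touch
        self._anchor: GazeSample | None = None  # start of current fixation

    def on_gesture(self, new_mode: Mode) -> None:
        # A lightweight, deliberate gesture (e.g. a glance off-screen)
        # snaps to another mode and cancels any dwell in progress.
        self.mode = new_mode
        self._anchor = None

    def on_gaze(self, sample: GazeSample):
        """Feed each gaze sample; returns an emulated mouse event or None."""
        if self.mode is Mode.GAZE_OFF:
            return None  # looking around never triggers anything

        a = self._anchor
        moved = a is None or ((sample.x - a.x) ** 2 +
                              (sample.y - a.y) ** 2) ** 0.5 > DWELL_RADIUS_PX
        if moved:
            self._anchor = sample  # gaze moved on: restart the dwell timer
            return None

        if sample.t - a.t >= DWELL_TIME_S:
            self._anchor = sample  # re-arm so one dwell yields one click
            return ("left_click", a.x, a.y)
        return None
```

A caller would pump tracker samples through on_gaze and route any returned events to a mouse emulator. The key property is that in GAZE_OFF the dwell logic is never reached at all, which is what removes the major cause of the 'Midas Touch' problem that the abstract describes.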





      Published In

      ETRA '08: Proceedings of the 2008 Symposium on Eye Tracking Research & Applications
      March 2008, 285 pages
      ISBN: 9781595939821
      DOI: 10.1145/1344471


      Publisher

      Association for Computing Machinery, New York, NY, United States



      Author Tags

      1. disabled users
      2. eye tracking
      3. feedback
      4. gaze control
      5. gaze gestures

      Qualifiers

      • Research-article

      Conference

      ETRA '08: Eye Tracking Research and Applications
      March 26-28, 2008
      Savannah, Georgia

      Acceptance Rates

      Overall acceptance rate: 69 of 137 submissions (50%)


      Bibliometrics

      Article Metrics

      • Downloads (last 12 months): 80
      • Downloads (last 6 weeks): 6
      Reflects downloads up to 27 Jul 2024.


      Cited By

      • (2024) GraV: Grasp Volume Data for the Design of One-Handed XR Interfaces. Proceedings of the 2024 ACM Designing Interactive Systems Conference, 151-167. DOI: 10.1145/3643834.3661567. Online publication date: 1-Jul-2024.
      • (2024) Towards an Eye-Brain-Computer Interface: Combining Gaze with the Stimulus-Preceding Negativity for Target Selections in XR. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3641925. Online publication date: 11-May-2024.
      • (2024) Guiding gaze gestures on smartwatches. International Journal of Human-Computer Studies, 183:C. DOI: 10.1016/j.ijhcs.2023.103196. Online publication date: 14-Mar-2024.
      • (2024) Using eye-tracking for real-time translation: a new approach to improving reading experience. CCF Transactions on Pervasive Computing and Interaction, 6:2, 150-164. DOI: 10.1007/s42486-024-00150-3. Online publication date: 9-Mar-2024.
      • (2023) Gaze-Based Human–Computer Interaction for Museums and Exhibitions: Technologies, Applications and Future Perspectives. Electronics, 12:14, 3064. DOI: 10.3390/electronics12143064. Online publication date: 13-Jul-2023.
      • (2023) Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction, 7:ETRA, 1-19. DOI: 10.1145/3591129. Online publication date: 18-May-2023.
      • (2023) Gaze & Tongue: A Subtle, Hands-Free Interaction for Head-Worn Devices. Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1-4. DOI: 10.1145/3544549.3583930. Online publication date: 19-Apr-2023.
      • (2023) XR Input Error Mediation for Hand-Based Input: Task and Context Influences a User's Preference. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 1006-1015. DOI: 10.1109/ISMAR59233.2023.00117. Online publication date: 16-Oct-2023.
      • (2023) Eye-gesture-based multi-context interaction. 2023 IEEE 18th Conference on Industrial Electronics and Applications (ICIEA), 429-434. DOI: 10.1109/ICIEA58696.2023.10241952. Online publication date: 18-Aug-2023.
      • (2023) Eyes can draw: A high-fidelity free-eye drawing method with unimodal gaze control. International Journal of Human-Computer Studies, 170, 102966. DOI: 10.1016/j.ijhcs.2022.102966. Online publication date: Mar-2023.
