DOI: 10.1145/3317956.3318154

GazeButton: enhancing buttons with eye gaze interactions

Published: 25 June 2019

Abstract

A button is a user-interface element that triggers an action, traditionally via click or touch. We introduce GazeButton, a novel concept that extends the default button behaviour with advanced gaze-based interactions. During normal interaction, users can use this button as a universal hub for gaze-based UI shortcuts. The advantages are: 1) it is easy to integrate into existing UIs, 2) it is complementary, as users choose either gaze or manual interaction, 3) it is straightforward, as all features are located in one button, and 4) one button suffices to interact with the whole screen. We explore GazeButtons in a custom-made text reading, writing, and editing tool on a multitouch tablet device. For example, users can place the text cursor by looking at the target position and tapping the GazeButton, avoiding costly physical movement. Or they can select a passage of text simply by gazing over it while holding the GazeButton. We present a design space, specific application examples, and point to future button designs that become highly expressive by unifying the user's visual and manual input.
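The two example interactions in the abstract reduce to a tap-versus-hold dispatch on the button, combined with the current gaze estimate: a quick tap places the cursor at the gazed position, while a sustained hold grows a selection under the moving gaze. The sketch below illustrates that logic in TypeScript; every name here (GazeTracker, TextView, the 300 ms threshold) is a hypothetical stand-in for illustration, not the authors' actual implementation.

```typescript
interface Point { x: number; y: number; }

// Assumed eye-tracker facade returning the latest on-screen gaze estimate.
interface GazeTracker {
  current(): Point;
}

// Assumed text widget that maps a screen point to a character offset.
interface TextView {
  charIndexAt(p: Point): number;
  setCursor(index: number): void;
  setSelection(from: number, to: number): void;
}

const HOLD_THRESHOLD_MS = 300; // assumed boundary between tap and hold

class GazeButton {
  private downAt = 0;
  private anchor = -1;
  private holdTimer: ReturnType<typeof setInterval> | null = null;

  constructor(private gaze: GazeTracker, private text: TextView) {}

  // Finger lands on the button: remember the character the user is
  // looking at, so a hold can grow a selection from that anchor.
  onTouchDown(): void {
    this.downAt = Date.now();
    this.anchor = this.text.charIndexAt(this.gaze.current());
    this.holdTimer = setInterval(() => {
      // Once the press counts as a hold, keep extending the selection
      // to wherever the user currently looks ("gaze over the text
      // while holding the GazeButton").
      if (Date.now() - this.downAt >= HOLD_THRESHOLD_MS) {
        const end = this.text.charIndexAt(this.gaze.current());
        this.text.setSelection(this.anchor, end);
      }
    }, 50);
  }

  // Finger lifts: a quick tap instead places the text cursor at the
  // gazed position, avoiding a costly physical reach across the tablet.
  onTouchUp(): void {
    if (this.holdTimer !== null) {
      clearInterval(this.holdTimer);
      this.holdTimer = null;
    }
    if (Date.now() - this.downAt < HOLD_THRESHOLD_MS) {
      this.text.setCursor(this.text.charIndexAt(this.gaze.current()));
    }
  }
}
```

The design keeps gaze implicit and touch explicit, consistent with the paper's framing: the eyes only indicate position, while the tap or hold on the single button commits the action.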




    Published In

    ETRA '19: Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications
June 2019, 623 pages
ISBN: 9781450367097
DOI: 10.1145/3314111


    Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. interaction modality
    2. text input
    3. touch and gaze

    Qualifiers

    • Research-article

    Conference

    ETRA '19

    Acceptance Rates

Overall acceptance rate: 69 of 137 submissions (50%)



    Article Metrics

• Downloads (last 12 months): 99
• Downloads (last 6 weeks): 12

Reflects downloads up to 15 Oct 2024


    Cited By

• (2024) Beyond Aesthetics: Evaluating Response Widgets for Reliability & Construct Validity of Scale Questionnaires. Extended Abstracts of the 2024 CHI Conference on Human Factors in Computing Systems, 1-7. DOI: 10.1145/3613905.3650751. Online publication date: 11-May-2024.
• (2023) Exploring Gaze-assisted and Hand-based Region Selection in Augmented Reality. Proceedings of the ACM on Human-Computer Interaction 7(ETRA), 1-19. DOI: 10.1145/3591129. Online publication date: 18-May-2023.
• (2023) Gaze-based Interaction on Handheld Mobile Devices. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-4. DOI: 10.1145/3588015.3589540. Online publication date: 30-May-2023.
• (2023) Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3588409. Online publication date: 30-May-2023.
• (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871. Online publication date: 19-Apr-2023.
• (2023) i-GSI: A Novel Grasp Switching Interface Based on Eye-Tracking and Augmented Reality for Multi-Grasp Prosthetic Hands. IEEE Robotics and Automation Letters 8(3), 1619-1626. DOI: 10.1109/LRA.2023.3240375. Online publication date: Mar-2023.
• (2023) Exploring Trajectory Data in Augmented Reality: A Comparative Study of Interaction Modalities. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 790-799. DOI: 10.1109/ISMAR59233.2023.00094. Online publication date: 16-Oct-2023.
• (2023) A Non-Contact Human-Computer Interaction Method Based on Gaze. 2023 IEEE 13th International Conference on Electronics Information and Emergency Communication (ICEIEC), 70-74. DOI: 10.1109/ICEIEC58029.2023.10199950. Online publication date: 14-Jul-2023.
• (2022) A Practical Method to Eye-tracking on the Phone: Toolkit, Accuracy and Precision. Proceedings of the 21st International Conference on Mobile and Ubiquitous Multimedia, 182-188. DOI: 10.1145/3568444.3568463. Online publication date: 27-Nov-2022.
• (2022) See through them. Proceedings of the 2nd Workshop on Games Systems, 1-4. DOI: 10.1145/3534085.3534338. Online publication date: 14-Jun-2022.
