DOI: 10.1145/2984511.2984514

Gaze and Touch Interaction on Tablets

Published: 16 October 2016
    Abstract

    We explore how gaze can support touch interaction on tablets. When holding the device, the free thumb is normally limited in reach, but it can provide an opportunity for indirect touch input. We propose gaze and touch input, where touches are redirected to the gaze target. This provides whole-screen reachability while using only a single hand for both holding and input. We present a user study comparing this technique to direct-touch, showing that users are slightly slower but benefit from one-handed use with less physical effort. To enable interaction with small targets, we introduce CursorShift, a method that uses gaze to give users temporal control over cursors during direct-touch interaction. Taken together, users can employ three techniques on tablets: direct-touch, gaze and touch, and cursor input. In three applications, we explore how these techniques can coexist in the same UI, demonstrate how tablet tasks can be performed with thumb-only input from the holding hand, and describe novel interaction techniques for gaze-based tablet interaction.
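
    As a minimal sketch of the redirection idea described in the abstract, consider an event filter that delivers each touch at the user's current gaze position. This is an illustration only, not the authors' implementation: the GazeTouchRedirector class and the gaze_tracker.current_fixation() and ui.dispatch_touch() interfaces are hypothetical placeholders, and the relative drag mapping after touch-down is an assumption made here to keep dragging stable under gaze-estimation jitter.

        from dataclasses import dataclass

        @dataclass
        class Point:
            x: float
            y: float

        class GazeTouchRedirector:
            """Sketch: redirect thumb touches to the current gaze target."""

            def __init__(self, gaze_tracker, ui):
                # Assumed collaborators (hypothetical, not a real API):
                #   gaze_tracker.current_fixation() -> Point, filtered gaze estimate
                #   ui.dispatch_touch(kind, point)  -> deliver a synthetic touch event
                self.gaze = gaze_tracker
                self.ui = ui
                self.origin = None   # physical touch-down point
                self.target = None   # gaze target captured at touch-down

            def on_touch_down(self, touch: Point) -> None:
                # Redirect the touch to wherever the user looks, so the thumb
                # of the holding hand can reach the whole screen.
                self.origin = touch
                self.target = self.gaze.current_fixation()
                self.ui.dispatch_touch("down", self.target)

            def on_touch_move(self, touch: Point) -> None:
                # After touch-down the mapping becomes relative: finger motion
                # moves the redirected point, keeping fine adjustments stable
                # even when the gaze estimate jitters.
                self.ui.dispatch_touch("move", self._relative(touch))

            def on_touch_up(self, touch: Point) -> None:
                self.ui.dispatch_touch("up", self._relative(touch))
                self.origin = self.target = None

            def _relative(self, touch: Point) -> Point:
                return Point(self.target.x + (touch.x - self.origin.x),
                             self.target.y + (touch.y - self.origin.y))

    The relative phase is one plausible way to bridge gaze's coarse accuracy and small targets, which is the gap the paper's CursorShift technique addresses with gaze-controlled cursors.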

    Supplementary Material

    Supplemental video: suppl.mov (uist1306-file3.mp4)
    MP4 file: p301-pfeuffer.mp4




      Published In

      UIST '16: Proceedings of the 29th Annual Symposium on User Interface Software and Technology
      October 2016
      908 pages
      ISBN:9781450341899
      DOI:10.1145/2984511

      Publisher

      Association for Computing Machinery, New York, NY, United States


      Author Tags

      1. cursor
      2. eye tracking
      3. gaze
      4. indirect input
      5. tablet
      6. touch

      Qualifiers

      • Research-article

      Conference

      UIST '16

      Acceptance Rates

      UIST '16 paper acceptance rate: 79 of 384 submissions, 21%
      Overall acceptance rate: 842 of 3,967 submissions, 21%



      Cited By

      • (2024) Investigating the Effects of Eye-Tracking Interpolation Methods on Model Performance of LSTM. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, 1-6. DOI: 10.1145/3649902.3656353. Online publication date: 4-Jun-2024.
      • (2024) Expanding V2X with V2DUIs: Distributed User Interfaces for Media Consumption in the Vehicle-to-Everything Era. Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, 394-401. DOI: 10.1145/3639701.3663643. Online publication date: 7-Jun-2024.
      • (2024) Gaze on the Go: Effect of Spatial Reference Frame on Visual Target Acquisition During Physical Locomotion in Extended Reality. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3613904.3642915. Online publication date: 11-May-2024.
      • (2024) GazePrompt: Enhancing Low Vision People's Reading Experience with Gaze-Aware Augmentations. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642878. Online publication date: 11-May-2024.
      • (2024) Uncovering and Addressing Blink-Related Challenges in Using Eye Tracking for Interactive Systems. Proceedings of the CHI Conference on Human Factors in Computing Systems, 1-23. DOI: 10.1145/3613904.3642086. Online publication date: 11-May-2024.
      • (2024) EyeShadows: Peripheral Virtual Copies for Rapid Gaze Selection and Interaction. 2024 IEEE Conference Virtual Reality and 3D User Interfaces (VR), 681-689. DOI: 10.1109/VR58804.2024.00088. Online publication date: 16-Mar-2024.
      • (2024) Evaluating Target Expansion for Eye Pointing Tasks. Interacting with Computers 36(4), 209-223. DOI: 10.1093/iwc/iwae004. Online publication date: 27-Feb-2024.
      • (2023) PalmGazer: Unimanual Eye-hand Menus in Augmented Reality. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-12. DOI: 10.1145/3607822.3614523. Online publication date: 13-Oct-2023.
      • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1-38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023.
      • (2023) Gaze-based Mode-Switching to Enhance Interaction with Menus on Tablets. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3588409. Online publication date: 30-May-2023.
