DOI: 10.1145/2663204.2663273

Understanding Users' Perceived Difficulty of Multi-Touch Gesture Articulation

Published: 12 November 2014

Abstract

We show that users are consistent in their assessments of the articulation difficulty of multi-touch gestures, even under the many degrees of freedom afforded by multi-touch input, such as (1) varying numbers of fingers touching the surface, (2) varying numbers of strokes structuring the gesture shape, and (3) single-handed and bimanual input. To understand more about perceived difficulty, we characterize gesture articulations captured under these conditions with geometric and kinematic descriptors computed on a dataset of 7,200 samples of 30 distinct gesture types collected from 18 participants. We correlate the values of the objective descriptors with users' subjective assessments of articulation difficulty and report path length, production time, and gesture size as the strongest correlates (max Pearson's r = .95). We also report new findings about multi-touch gesture input, e.g., gestures produced with more fingers are larger in size and take more time to produce than single-touch gestures; bimanual articulations are not only faster than single-handed input, but are also longer in path length, contain more strokes, and result in gesture shapes that are deformed horizontally by 35% on average. We use our findings to outline 14 guidelines that assist multi-touch gesture set design and recognizer development, and inform gesture-to-function mappings through the prism of the user-perceived difficulty of gesture articulation.
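To make the reported analysis concrete, below is a minimal sketch of how three of the reported descriptors (path length, production time, gesture size) could be computed from logged touch data and correlated with difficulty ratings. The gesture representation (one list of (x, y, t) samples per stroke), the helper names, and the use of SciPy's pearsonr are illustrative assumptions, not the authors' actual pipeline.

import math
from scipy.stats import pearsonr

def path_length(strokes):
    # Total Euclidean path length over all strokes; each stroke is a
    # list of (x, y, t) touch samples with t in seconds.
    total = 0.0
    for stroke in strokes:
        for (x0, y0, _), (x1, y1, _) in zip(stroke, stroke[1:]):
            total += math.hypot(x1 - x0, y1 - y0)
    return total

def production_time(strokes):
    # Elapsed time from the first to the last touch sample.
    times = [t for stroke in strokes for (_, _, t) in stroke]
    return max(times) - min(times)

def gesture_size(strokes):
    # One notion of gesture "size": the bounding-box diagonal.
    xs = [x for stroke in strokes for (x, _, _) in stroke]
    ys = [y for stroke in strokes for (_, y, _) in stroke]
    return math.hypot(max(xs) - min(xs), max(ys) - min(ys))

def correlate_with_difficulty(gestures, ratings):
    # Pearson's r (and p-value) between each descriptor and the
    # subjective difficulty ratings, one rating per gesture sample.
    descriptors = {"path length": path_length,
                   "production time": production_time,
                   "gesture size": gesture_size}
    return {name: pearsonr([f(g) for g in gestures], ratings)
            for name, f in descriptors.items()}

Applied to per-gesture descriptor values and matching difficulty ratings, correlate_with_difficulty mirrors the shape of the analysis behind the reported maximum Pearson's r = .95.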



    Published In

ICMI '14: Proceedings of the 16th International Conference on Multimodal Interaction
November 2014, 558 pages
ISBN: 9781450328852
DOI: 10.1145/2663204


    Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. articulation difficulty
    2. ease of execution
    3. gesture articulation
    4. gesture geometry
    5. gesture kinematics
    6. gesture structure
    7. multi-touch
    8. number of fingers
    9. number of hands
    10. number of strokes
    11. user study

    Qualifiers

    • Poster


    Acceptance Rates

ICMI '14 paper acceptance rate: 51 of 127 submissions (40%)
Overall acceptance rate: 453 of 1,080 submissions (42%)
