DOI: 10.1145/3544549.3585695
Work in Progress

An Expressivity-Complexity Tradeoff?: User-Defined Gestures from the Wheelchair Space are Mostly Deictic

Published: 19 April 2023
  Abstract

    We present empirical results about gesture expressivity and articulation complexity from an analysis of 231 gestures elicited from eleven wheelchair users, for which we employ a combination of McNeill’s gesture theory from psycholinguistics and a taxonomy used in Human-Computer Interaction for the analysis of gestures elicited from end users. We report that 53.7% of the gestures that we analyzed were deictic in nature, and 50.7% were performed toward the body. These findings suggest a potential tradeoff in which the expressivity of iconic and metaphoric gestures, less represented in the gesture set analyzed in this work, is traded for the low complexity of simple pointing movements performed from the wheelchair space. Our results complement findings of previous gesture elicitation studies conducted with users with motor and/or mobility impairments, and suggest future work opportunities for gesture input performed from the wheelchair space, including mixed-nature gestures that feature both low complexity and rich expressivity.
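
    To make the reported proportions concrete, the sketch below tabulates the share of each gesture category in a coded gesture set. It is a minimal illustration, not the authors' analysis code: the category labels and the example counts (124 deictic gestures out of 231, which reproduces the reported 53.7%) are assumptions chosen only to match the figures quoted in the abstract.

        # Minimal sketch (not the authors' analysis pipeline): computing the share of each
        # McNeill category in a coded gesture set. Labels and counts below are hypothetical,
        # chosen so that deictic gestures make up 124 of 231 gestures, i.e., ~53.7%.
        from collections import Counter

        def category_shares(labels):
            """Return each category's share of the gesture set, as a percentage."""
            counts = Counter(labels)
            total = len(labels)
            return {category: 100.0 * n / total for category, n in counts.items()}

        # Hypothetical coded gesture set of 231 gestures.
        gestures = ["deictic"] * 124 + ["iconic"] * 40 + ["metaphoric"] * 30 + ["beat"] * 37
        shares = category_shares(gestures)
        print(f"deictic: {shares['deictic']:.1f}% of {len(gestures)} gestures")
        # prints: deictic: 53.7% of 231 gestures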

    Supplementary Material

    MP4 File (3544549.3585695-talk-video.mp4)
    Pre-recorded Video Presentation

    References

    [1]
    Reuben M. Aronson, Nadia Almutlak, and Henny Admoni. 2021. Inferring Goals with Gaze during Teleoperated Manipulation. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS ’21). IEEE, USA, 7307–7314. https://doi.org/10.1109/IROS51168.2021.9636551
    [2]
    Yuki Asai, Yuta Ueda, Ryuichi Enomoto, Daisuke Iwai, and Kosuke Sato. 2016. ExtendedHand on Wheelchair. In Adjunct Proceedings of the 29th Annual ACM Symposium on User Interface Software and Technology (UIST ’16 Adjunct). ACM, New York, NY, USA, 147–148. https://doi.org/10.1145/2984751.2985738
    [3]
    Laura-Bianca Bilius, Ovidiu-Ciprian Ungurean, and Radu-Daniel Vatavu. 2023. Understanding Wheelchair Users’ Preferences for On-Body, In-Air, and On-Wheelchair Gestures. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23). ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3544548.3580929
    [4]
    Xiang Cao and Shumin Zhai. 2007. Modeling Human Performance of Pen Stroke Gestures. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’07). ACM, New York, NY, USA, 1495–1504. https://doi.org/10.1145/1240624.1240850
    [5]
    Baptiste Caramiaux, Nicola Montecchio, Atau Tanaka, and Frédéric Bevilacqua. 2014. Adaptive Gesture Recognition with Variation Estimation for Interactive Systems. ACM Trans. Interact. Intell. Syst. 4, 4, Article 18 (2014), 34 pages. https://doi.org/10.1145/2643204
    [6]
    Patrick Carrington, Jian-Ming Chang, Kevin Chang, Catherine Hornback, Amy Hurst, and Shaun K. Kane. 2016. The Gest-Rest Family: Exploring Input Possibilities for Wheelchair Armrests. ACM Trans. Accessible Computing 8, 3, Article 12 (2016), 24 pages. https://doi.org/10.1145/2873062
    [7]
    Patrick Carrington, Amy Hurst, and Shaun K. Kane. 2013. How Power Wheelchair Users Choose Computing Devices. In Proceedings of the 15th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’13). ACM, New York, NY, USA, Article 52, 2 pages. https://doi.org/10.1145/2513383.2513426
    [8]
    Patrick Carrington, Amy Hurst, and Shaun K. Kane. 2014. Wearables and Chairables: Inclusive Design of Mobile Input and Output Techniques for Power Wheelchair Users. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’14). ACM, New York, NY, USA, 3103–3112. https://doi.org/10.1145/2556288.2557237
    [9]
    Chris Creed and Russell Beale. 2014. Enhancing Multi-Touch Table Accessibility for Wheelchair Users. In Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility (ASSETS ’14). ACM, New York, NY, USA, 255–256. https://doi.org/10.1145/2661334.2661388
    [10]
    Leah Findlater, Alex Jansen, Kristen Shinohara, Morgan Dixon, Peter Kamb, Joshua Rakita, and Jacob O. Wobbrock. 2010. Enhanced Area Cursors: Reducing Fine Pointing Demands for People with Motor Impairments. In Proceedings of the 23rd Annual ACM Symposium on User Interface Software and Technology (UIST ’10). ACM, New York, NY, USA, 153–162. https://doi.org/10.1145/1866029.1866055
    [11]
    Krzysztof Z. Gajos, Jacob O. Wobbrock, and Daniel S. Weld. 2007. Automatically Generating User Interfaces Adapted to Users’ Motor and Vision Capabilities. In Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology (UIST ’07). ACM, New York, NY, USA, 231–240. https://doi.org/10.1145/1294211.1294253
    [12]
    Giulia Galli, Jean-Paul Noel, Elisa Canzoneri, Olaf Blanke, and Andrea Serino. 2015. The Wheelchair as a Full-Body Tool Extending the Peripersonal Space. Frontiers in Psychology 6 (2015), 639:1–639:11. https://doi.org/10.3389/fpsyg.2015.00639
    [13]
    Kathrin Gerling, Patrick Dickinson, Kieran Hicks, Liam Mason, Adalberto L. Simeone, and Katta Spiel. 2020. Virtual Reality Games for People Using Wheelchairs. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’20). ACM, New York, NY, USA, 1–11. https://doi.org/10.1145/3313831.3376265
    [14]
    Kilem Li Gwet. 2010. Computing Inter-Rater Reliability and Its Variance in the Presence of High Agreement. Brit. J. Math. Statist. Psych. 61 (2010), 29–48. https://doi.org/10.1348/000711006X126600
    [15]
    Max Hildebrand, Frederik Bonde, Rasmus Vedel Nonboe Kobborg, Christian Andersen, Andreas Flem Norman, Mikkel Thøgersen, Stefan Hein Bengtson, Strahinja Dosen, and Lotte N. S. Andreasen Struijk. 2019. Semi-Autonomous Tongue Control of an Assistive Robotic Arm for Individuals with Quadriplegia. In Proceedings of the 16th IEEE International Conference on Rehabilitation Robotics (ICORR ’19). IEEE, USA, 157–162. https://doi.org/10.1109/ICORR.2019.8779457
    [16]
    Ananda Sankar Kundu, Oishee Mazumder, Prasanna Kumar Lenka, and Subhasis Bhaumik. 2018. Hand Gesture Recognition Based Omnidirectional Wheelchair Control Using IMU and EMG Sensors. J. Intell. Robotics Syst. 91, 3–4 (sep 2018), 529–541. https://doi.org/10.1007/s10846-017-0725-0
    [17]
    Mohammed Kutbi, Xiaoxue Du, Yizhe Chang, Bo Sun, Nikolaos Agadakos, Haoxiang Li, Gang Hua, and Philippos Mordohai. 2020. Usability Studies of an Egocentric Vision-Based Robotic Wheelchair. J. Hum.-Robot Interact. 10, 1, Article 4 (jul 2020), 23 pages. https://doi.org/10.1145/3399434
    [18]
    Luis A. Leiva, Daniel Martín-Albo, Réjean Plamondon, and Radu-Daniel Vatavu. 2018. KeyTime: Super-Accurate Prediction of Stroke Gesture Production Times. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’18). ACM, New York, NY, USA, Article 239, 12 pages. https://doi.org/10.1145/3173574.3173813
    [19]
    Meethu Malu, Pramod Chundury, and Leah Findlater. 2018. Exploring Accessible Smartwatch Interactions for People with Upper Body Motor Impairments. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’18). ACM, New York, NY, USA, Article 488, 12 pages. https://doi.org/10.1145/3173574.3174062
    [20]
    Meethu Malu and Leah Findlater. 2015. Personalized, Wearable Control of a Head-Mounted Display for Users with Upper Body Motor Impairments. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI ’15). ACM, New York, NY, USA, 221–230. https://doi.org/10.1145/2702123.2702188
    [21]
    David McNeill. 1992. Hand and Mind: What Gestures Reveal About Thought. Vol. 27. University of Chicago Press, Chicago. https://psycnet.apa.org/record/1992-98214-000
    [22]
    Meredith Ringel Morris, Andreea Danielescu, Steven Drucker, Danyel Fisher, Bongshin Lee, m. c. schraefel, and Jacob O. Wobbrock. 2014. Reducing Legacy Bias in Gesture Elicitation Studies. Interactions 21, 3 (may 2014), 40–45. https://doi.org/10.1145/2591689
    [23]
    David M. Roy, Marilyn Panayi, Roman Erenshteyn, Richard Foulds, and Robert Fawcus. 1994. Gestural Human-Machine Interaction for People with Severe Speech and Motor Impairment Due to Cerebral Palsy. In Conference Companion on Human Factors in Computing Systems (CHI ’94). ACM, New York, NY, USA, 313–314. https://doi.org/10.1145/259963.260375
    [24]
    Quentin Roy, Sylvain Malacria, Yves Guiard, Eric Lecolinet, and James Eagan. 2013. Augmented Letters: Mnemonic Gesture-Based Shortcuts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’13). ACM, New York, NY, USA, 2325–2328. https://doi.org/10.1145/2470654.2481321
    [25]
    Ovidiu-Andrei Schipor, Laura-Bianca Bilius, and Radu-Daniel Vatavu. 2022. WearSkill: Personalized and Interchangeable Input with Wearables for Users with Motor Impairments. In Proceedings of the 19th International Web for All Conference (W4A ’22). ACM, New York, NY, USA, Article 10, 5 pages. https://doi.org/10.1145/3493612.3520455
    [26]
    Ovidiu-Andrei Schipor and Radu-Daniel Vatavu. 2018. Invisible, Inaudible, and Impalpable: Users’ Preferences and Memory Performance for Digital Content in Thin Air. IEEE Pervasive Computing 17, 04 (2018), 76–85. https://doi.org/10.1109/MPRV.2018.2873856
    [27]
    Zhida Sun, Sitong Wang, Chengzhong Liu, and Xiaojuan Ma. 2022. Metaphoraction: Support Gesture-Based Interaction Design with Metaphorical Meanings. ACM Trans. Comput.-Hum. Interact. 29, 5, Article 45 (oct 2022), 33 pages. https://doi.org/10.1145/3511892
    [28]
    Katherine M. Tsui, Dae-Jin Kim, Aman Behal, David Kontak, and Holly A. Yanco. 2011. “I Want That”: Human-in-the-Loop Control of a Wheelchair-Mounted Robotic Arm. Appl. Bionics Biomechanics 8, 1 (jan 2011), 127–147.
    [29]
    Ovidiu-Ciprian Ungurean and Radu-Daniel Vatavu. 2021. Coping, Hacking, and DIY: Reframing the Accessibility of Interactions with Television for People with Motor Impairments. In Proceedings of the ACM International Conference on Interactive Media Experiences (IMX ’21). ACM, New York, NY, USA, 37–49. https://doi.org/10.1145/3452918.3458802
    [30]
    Ovidiu-Ciprian Ungurean and Radu-Daniel Vatavu. 2022. "I Gave up Wearing Rings:" Insights on the Perceptions and Preferences of Wheelchair Users for Interactions With Wearables. IEEE Pervasive Computing 21, 3 (2022), 92–101. https://doi.org/10.1109/MPRV.2022.3155952
    [31]
    Radu-Daniel Vatavu and Ovidiu-Ciprian Ungurean. 2022. Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’22). ACM, New York, NY, USA, Article 2, 16 pages. https://doi.org/10.1145/3491102.3501964
    [32]
    Radu-Daniel Vatavu, Ovidiu-Ciprian Ungurean, and Laura-Bianca Bilius. 2022. Interactive Public Displays and Wheelchair Users: Between Direct, Personal and Indirect, Assisted Interaction. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (UIST ’22). ACM, New York, NY, USA, Article 45, 17 pages. https://doi.org/10.1145/3526113.3545662
    [33]
    Radu-Daniel Vatavu, Daniel Vogel, Géry Casiez, and Laurent Grisoni. 2011. Estimating the Perceived Difficulty of Pen Gestures. In Proceedings of the 13th IFIP TC 13 International Conference on Human-Computer Interaction (INTERACT ’11). Springer-Verlag, Berlin, Heidelberg, 89–106.
    [34]
    Radu-Daniel Vatavu and Jacob O. Wobbrock. 2022. Clarifying Agreement Calculations and Analysis for End-User Elicitation Studies. ACM Trans. Comput.-Hum. Interact. 29, 1, Article 5 (jan 2022), 70 pages. https://doi.org/10.1145/3476101
    [35]
    Santiago Villarreal-Narvaez, Jean Vanderdonckt, Radu-Daniel Vatavu, and Jacob O. Wobbrock. 2020. A Systematic Review of Gesture Elicitation Studies: What Can We Learn from 216 Studies?. In Proceedings of the ACM Designing Interactive Systems Conference. ACM, New York, NY, USA, 855–872. https://doi.org/10.1145/3357236.3395511
    [36]
    Jacob O. Wobbrock, Htet Htet Aung, Brandon Rothrock, and Brad A. Myers. 2005. Maximizing the Guessability of Symbolic Input. In CHI ’05 Extended Abstracts on Human Factors in Computing Systems (CHI EA ’05). ACM, New York, NY, USA, 1869–1872. https://doi.org/10.1145/1056808.1057043
    [37]
    Jacob O. Wobbrock, Krzysztof Z. Gajos, Shaun K. Kane, and Gregg C. Vanderheiden. 2018. Ability-Based Design. Commun. ACM 61, 6 (may 2018), 62–71. https://doi.org/10.1145/3148051
    [38]
    Jacob O. Wobbrock, Meredith Ringel Morris, and Andrew D. Wilson. 2009. User-Defined Gestures for Surface Computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’09). ACM, New York, NY, USA, 1083–1092. https://doi.org/10.1145/1518701.1518866
    [39]
    Jacob O. Wobbrock, Brad A. Myers, Htet Htet Aung, and Edmund F. LoPresti. 2003. Text Entry from Power Wheelchairs: EdgeWrite for Joysticks and Touchpads. In Proceedings of the 6th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS ’04). ACM, New York, NY, USA, 110–117. https://doi.org/10.1145/1028630.1028650
    [40]
    Xuan Zhao, Mingming Fan, and Teng Han. 2022. “I Don’t Want People to Look At Me Differently”: Designing User-Defined Above-the-Neck Gestures for People with Upper Body Motor Impairments. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI ’22). ACM, New York, NY, USA, Article 1, 15 pages. https://doi.org/10.1145/3491102.3517552

    Cited By

    • (2024) ChairMX: On-Chair Input for Interactive Media Consumption Experiences for Everyone, Everywhere. In Proceedings of the 2024 ACM International Conference on Interactive Media Experiences, 447–451. https://doi.org/10.1145/3639701.3661090. Online publication date: 7-Jun-2024.


          Published In

          CHI EA '23: Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems
          April 2023
          3914 pages
          ISBN:9781450394222
          DOI:10.1145/3544549
          Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.


          Publisher

          Association for Computing Machinery

          New York, NY, United States

          Publication History

          Published: 19 April 2023


          Author Tags

          1. Gesture input
          2. deictic gestures
          3. gesture elicitation
          4. iconic gestures
          5. mobility impairments
          6. motor impairments
          7. wheelchair users

          Qualifiers

          • Work in progress
          • Research
          • Refereed limited


          Conference

          CHI '23

          Acceptance Rates

          Overall Acceptance Rate 6,164 of 23,696 submissions, 26%


