DOI: 10.1145/2639189.2639210

Belly gestures: body centric gestures on the abdomen

Published: 26 October 2014

Abstract

Recent HCI research has shown that the body offers an interactive surface particularly suitable for eyes-free interaction. While researchers have mainly focused on the arms and the hands, we argue that the surface of the belly is especially appropriate. The belly offers a fairly large surface that can easily be reached with both hands in any circumstance, including while walking or running. We report on a study that explored how users perform one-handed gestures on their abdomen. Users adopt different mental spatial orientations depending on the complexity of the gesture they have to draw (a digit vs. a simple directional stroke). When provided with no visual orientation cues, they often draw gestures following symmetries relative to a horizontal or vertical axis; the more complex the gesture, the less stable its orientation. Focusing on directional strokes, we found that users are able to draw almost linear gestures, despite the fact that the abdomen is not perfectly planar, and perform particularly well in the cardinal directions. The paper concludes with guidelines that may inform the design of novel interaction techniques.




Published In

NordiCHI '14: Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational
October 2014, 361 pages
ISBN: 9781450325424
DOI: 10.1145/2639189

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. abdomen
    2. belly
    3. gestural interaction
    4. gestures
    5. on-body interaction
    6. spatial orientation

    Qualifiers

    • Research-article

    Funding Sources

    • OSEO

    Conference

    NordiCHI '14

    Acceptance Rates

NordiCHI '14 paper acceptance rate: 89 of 361 submissions, 25%
Overall acceptance rate: 379 of 1,572 submissions, 24%

Article Metrics

• Downloads (last 12 months): 24
• Downloads (last 6 weeks): 1

Reflects downloads up to 08 Feb 2025

Cited By

• BodyTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 4 (2024), 1--22. DOI: 10.1145/3631426. Online publication date: 12 Jan 2024.
• Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56, 5 (2023), 1--55. DOI: 10.1145/3636458. Online publication date: 7 Dec 2023.
• LapTouch. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7, 3 (2023), 1--23. DOI: 10.1145/3610878. Online publication date: 27 Sep 2023.
• Augmenting On-Body Touch Input with Tactile Feedback Through Fingernail Haptics. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (2023), 1--13. DOI: 10.1145/3544548.3581473. Online publication date: 19 Apr 2023.
• Tailor Twist: Assessing Rotational Mid-Air Interactions for Augmented Reality. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (2023), 1--14. DOI: 10.1145/3544548.3581461. Online publication date: 19 Apr 2023.
• Exploring user-defined gestures for lingual and palatal interaction. Journal on Multimodal User Interfaces 17, 3 (2023), 167--185. DOI: 10.1007/s12193-023-00408-7. Online publication date: 10 Aug 2023.
• The sense of agency in emerging technologies for human–computer integration: A review. Frontiers in Neuroscience 16 (2022). DOI: 10.3389/fnins.2022.949138. Online publication date: 12 Sep 2022.
• Iteratively Designing Gesture Vocabularies: A Survey and Analysis of Best Practices in the HCI Literature. ACM Transactions on Computer-Human Interaction 29, 4 (2022), 1--54. DOI: 10.1145/3503537. Online publication date: 5 May 2022.
• Next Steps in Epidermal Computing: Opportunities and Challenges for Soft On-Skin Devices. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems (2022), 1--22. DOI: 10.1145/3491102.3517668. Online publication date: 29 Apr 2022.
• A Gesture Elicitation Study of Nose-Based Gestures. Sensors 20, 24 (2020), 7118. DOI: 10.3390/s20247118. Online publication date: 11 Dec 2020.
