DOI: 10.1145/1978942.1978971
Research article

User-defined motion gestures for mobile interaction

Published: 07 May 2011

Abstract

Modern smartphones contain sophisticated sensors that monitor the three-dimensional movement of the device. These sensors permit devices to recognize motion gestures: deliberate movements of the device by end-users to invoke commands. However, little is known about best practices in motion gesture design for the mobile computing paradigm. To address this issue, we present the results of a guessability study that elicits end-user motion gestures to invoke commands on a smartphone device. We demonstrate that consensus exists among our participants on parameters of movement and on mappings of motion gestures onto commands. We use this consensus to develop a taxonomy for motion gestures and to specify an end-user-inspired motion gesture set. We highlight the implications of this work for the design of smartphone applications and hardware. Finally, we argue that our results influence best practices in design for all gestural interfaces.
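The consensus reported in a guessability study of this kind is typically quantified with an agreement score computed over the gestures participants propose for each command (referent). The sketch below is a minimal illustration of that computation, assuming Wobbrock et al.'s widely used formulation of agreement (per referent, the sum of squared fractions of participants proposing each distinct gesture); the commands, gesture labels, and counts are hypothetical and are not data from this paper.

    from collections import Counter

    def agreement(proposals):
        """Agreement score for one referent: the sum, over each group of
        identical proposed gestures, of the squared fraction of participants
        who proposed that gesture. 1.0 means every participant agreed;
        1/n means every participant proposed something different."""
        total = len(proposals)
        return sum((count / total) ** 2 for count in Counter(proposals).values())

    # Hypothetical elicitation data (NOT from this paper):
    # referent -> gesture label proposed by each of 20 participants.
    elicited = {
        "answer call": ["raise-to-ear"] * 14 + ["shake"] * 3 + ["flip"] * 3,
        "next item":   ["flick-right"] * 10 + ["tilt-right"] * 8 + ["shake"] * 2,
    }

    for referent, gestures in elicited.items():
        print(f"{referent}: A = {agreement(gestures):.3f}")

    # Overall agreement is the mean of the per-referent scores.
    overall = sum(agreement(g) for g in elicited.values()) / len(elicited)
    print(f"overall agreement = {overall:.3f}")

Scores near 1.0 indicate the kind of consensus the abstract reports; scores approaching 1/n indicate that participants share no mapping for that command, a signal that the gesture may need to be designer-assigned rather than user-defined.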




Information

Published In

CHI '11: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
May 2011, 3530 pages
ISBN: 9781450302289
DOI: 10.1145/1978942
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 07 May 2011


Author Tags

  1. mobile interaction
  2. motion gestures
  3. sensors

Qualifiers

  • Research-article

Conference

CHI '11

Acceptance Rates

CHI '11 paper acceptance rate: 410 of 1,532 submissions (27%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)


Article Metrics

  • Downloads (last 12 months): 266
  • Downloads (last 6 weeks): 29
Reflects downloads up to 22 Sep 2024.


Cited By

  • (2024) Assessing the Acceptance of a Mid-Air Gesture Syntax for Smart Space Interaction: An Empirical Study. Journal of Sensor and Actuator Networks 13(2), 25. DOI: 10.3390/jsan13020025. Online publication date: 9-Apr-2024.
  • (2024) Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-23. DOI: 10.1145/3676503. Online publication date: 24-Sep-2024.
  • (2024) Engineering Touchscreen Input for 3-Way Displays: Taxonomy, Datasets, and Classification. Companion Proceedings of the 16th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, 57-65. DOI: 10.1145/3660515.3661331. Online publication date: 24-Jun-2024.
  • (2024) User Preferences for Interactive 3D Object Transitions in Cross Reality - An Elicitation Study. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3656650.3656698. Online publication date: 3-Jun-2024.
  • (2024) Designing Upper-Body Gesture Interaction with and for People with Spinal Muscular Atrophy in VR. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-19. DOI: 10.1145/3613904.3642884. Online publication date: 11-May-2024.
  • (2024) "Can It Be Customized According to My Motor Abilities?": Toward Designing User-Defined Head Gestures for People with Dystonia. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-11. DOI: 10.1145/3613904.3642378. Online publication date: 11-May-2024.
  • (2024) Take a Seat, Make a Gesture: Charting User Preferences for On-Chair and From-Chair Gesture Input. Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3613904.3642028. Online publication date: 11-May-2024.
  • (2024) A Dynamic Gesture Recognition Algorithm Using Single Halide Perovskite Photovoltaic Cell for Human-Machine Interaction. 2024 International Conference on Electronics, Information, and Communication (ICEIC), 1-4. DOI: 10.1109/ICEIC61013.2024.10457182. Online publication date: 28-Jan-2024.
  • (2024) User-Defined Interactions for Visual Data Exploration With the Combination of Smartwatch and Large Display. IEEE Access 12, 78657-78679. DOI: 10.1109/ACCESS.2024.3404876. Online publication date: 2024.
  • (2024) Innovative Interaction Mode in VR Games. Frontier Computing on Industrial Applications Volume 4, 77-86. DOI: 10.1007/978-981-99-9342-0_9. Online publication date: 21-Jan-2024.
