Research article · DOI: 10.1145/1518701.1518866

User-defined gestures for surface computing

Published: 04 April 2009

Abstract

Many surface computing prototypes have employed gestures created by system designers. Although such gestures are appropriate for early investigations, they are not necessarily reflective of user behavior. We present an approach to designing tabletop gestures that relies on eliciting gestures from non-technical users by first portraying the effect of a gesture, and then asking users to perform its cause. In all, 1080 gestures from 20 participants were logged, analyzed, and paired with think-aloud data for 27 commands performed with 1 and 2 hands. Our findings indicate that users rarely care about the number of fingers they employ, that one hand is preferred to two, that desktop idioms strongly influence users' mental models, and that some commands elicit little gestural agreement, suggesting the need for on-screen widgets. We also present a complete user-defined gesture set, quantitative agreement scores, implications for surface technology, and a taxonomy of surface gestures. Our results will help designers create better gesture sets informed by user behavior.
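The quantitative agreement scores mentioned in the abstract can be sketched in code. The abstract does not give the formula, so this is an assumption based on the guessability-style measure common in gesture elicitation work: for each command, identical gesture proposals are grouped, and the squared proportions of the groups are summed. The function name `agreement` and the sample gesture labels are illustrative only.

```python
# Sketch of a per-referent agreement score (assumed guessability-style
# measure: sum over proposal groups of (|group| / |proposals|)^2).
# The formula is not stated in the abstract; it is assumed here.
from collections import Counter

def agreement(proposals):
    """proposals: list of gesture labels elicited for one command."""
    n = len(proposals)
    if n == 0:
        return 0.0
    # Group identical proposals, then sum squared group proportions.
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Hypothetical example: 20 participants propose gestures for one command.
gestures = ["drag"] * 14 + ["flick"] * 4 + ["two-finger-drag"] * 2
print(round(agreement(gestures), 3))  # → 0.54
```

A score of 1.0 means every participant proposed the same gesture; scores near 1/n indicate little consensus, which is the situation the authors suggest calls for on-screen widgets instead of gestures.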



Published In

CHI '09: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2009, 2426 pages
ISBN: 9781605582467
DOI: 10.1145/1518701

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gesture recognition
  2. gestures
  3. guessability
  4. referents
  5. signs
  6. surface
  7. tabletop
  8. think-aloud

Qualifiers

  • Research-article

Conference

CHI '09
Acceptance Rates

CHI '09 Paper Acceptance Rate 277 of 1,130 submissions, 25%;
Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


Cited By

  • (2024) Assessing the Acceptance of a Mid-Air Gesture Syntax for Smart Space Interaction: An Empirical Study. Journal of Sensor and Actuator Networks 13(2), 25. DOI: 10.3390/jsan13020025. Online publication date: 9-Apr-2024.
  • (2024) Designing Gestures for Data Exploration with Public Displays via Identification Studies. Information 15(6), 292. DOI: 10.3390/info15060292. Online publication date: 21-May-2024.
  • (2024) Exploring User-Defined Gestures as Input for Hearables and Recognizing Ear-Level Gestures with IMUs. Proceedings of the ACM on Human-Computer Interaction 8(MHCI), 1-23. DOI: 10.1145/3676503. Online publication date: 24-Sep-2024.
  • (2024) Enabling Safer Augmented Reality Experiences: Usable Privacy Interventions for AR Creators and End-Users. Adjunct Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-8. DOI: 10.1145/3672539.3686708. Online publication date: 13-Oct-2024.
  • (2024) I've Got the Data in My Pocket! - Exploring Interaction Techniques with Everyday Objects for Cross-Device Data Transfer. Proceedings of Mensch und Computer 2024, 242-255. DOI: 10.1145/3670653.3670778. Online publication date: 1-Sep-2024.
  • (2024) Beyond Radar Waves: The First Workshop on Radar-Based Human-Computer Interaction. Companion Proceedings of the 16th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, 97-102. DOI: 10.1145/3660515.3662836. Online publication date: 24-Jun-2024.
  • (2024) Engineering Touchscreen Input for 3-Way Displays: Taxonomy, Datasets, and Classification. Companion Proceedings of the 16th ACM SIGCHI Symposium on Engineering Interactive Computing Systems, 57-65. DOI: 10.1145/3660515.3661331. Online publication date: 24-Jun-2024.
  • (2024) A Type System for Flexible User Interactions Handling. Proceedings of the ACM on Human-Computer Interaction 8(EICS), 1-27. DOI: 10.1145/3660248. Online publication date: 17-Jun-2024.
  • (2024) User Preferences for Interactive 3D Object Transitions in Cross Reality - An Elicitation Study. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-9. DOI: 10.1145/3656650.3656698. Online publication date: 3-Jun-2024.
  • (2024) Feminist Interaction Techniques: Social Consent Signals to Deter NCIM Screenshots. Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology, 1-14. DOI: 10.1145/3654777.3676380. Online publication date: 13-Oct-2024.
