Research article · AVI '22 Conference Proceedings
DOI: 10.1145/3531073.3531112

RepliGES and GEStory: Visual Tools for Systematizing and Consolidating Knowledge on User-Defined Gestures

Published: 06 June 2022

Abstract

The body of knowledge accumulated by gesture elicitation studies (GES), although useful, large, and extensive, is also heterogeneous, scattered across the scientific literature in different venues and fields of research, and difficult to generalize to other contexts of use represented by different gesture types, sensing devices, applications, and user categories. To address these aspects, we introduce RepliGES, a conceptual space that supports (1) replications of gesture elicitation studies to confirm, extend, and complete previous findings, (2) reuse of previously elicited gesture sets to enable new discoveries, and (3) extension and generalization of previous findings with new methods of analysis and for new user populations, towards consolidated knowledge of user-defined gestures. Based on RepliGES, we introduce GEStory, an interactive design space and visual tool, to structure, visualize, and identify user-defined gestures from 216 published gesture elicitation studies.

Supplemental Material

ZIP File
Supplemental videos




Published In

cover image ACM Other conferences
AVI '22: Proceedings of the 2022 International Conference on Advanced Visual Interfaces
June 2022
414 pages
ISBN:9781450397193
DOI:10.1145/3531073
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. gesture elicitation studies
  2. generalization
  3. replicability
  4. reproducibility
  5. repurposing
  6. visual tools

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Funding Sources

  • Ministry of Research, Innovation and Digitization

Conference

AVI 2022

Acceptance Rates

Overall Acceptance Rate 128 of 490 submissions, 26%


Cited By

  • Brave New GES World: A Systematic Literature Review of Gestures and Referents in Gesture Elicitation Studies. ACM Computing Surveys 56, 5 (2024), 1–55. https://doi.org/10.1145/3636458
  • Take a Seat, Make a Gesture: Charting User Preferences for On-Chair and From-Chair Gesture Input. In Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3613904.3642028
  • Exploring Methods to Optimize Gesture Elicitation Studies: A Systematic Literature Review. IEEE Access 12 (2024), 64958–64979. https://doi.org/10.1109/ACCESS.2024.3387269
  • Audio-visual training and feedback to learn touch-based gestures. Journal of Visualization 27, 6 (2024), 1117–1142. https://doi.org/10.1007/s12650-024-01012-x
  • New Insights into User-Defined Smart Ring Gestures with Implications for Gesture Elicitation Studies. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems, 1–8. https://doi.org/10.1145/3544549.3585590
  • Understanding Wheelchair Users' Preferences for On-Body, In-Air, and On-Wheelchair Gestures. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–16. https://doi.org/10.1145/3544548.3580929
  • iFAD Gestures: Understanding Users' Gesture Input Performance with Index-Finger Augmentation Devices. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–17. https://doi.org/10.1145/3544548.3580928
  • Flexible gesture input with radars: systematic literature review and taxonomy of radar sensing integration in ambient intelligence environments. Journal of Ambient Intelligence and Humanized Computing 14, 6 (2023), 7967–7981. https://doi.org/10.1007/s12652-023-04606-9
