DOI: 10.1145/3319499.3328227

GestMan: a cloud-based tool for stroke-gesture datasets

Published: 18 June 2019

Abstract

We introduce GestMan, a cloud-based GESTure MANagement tool that supports the acquisition, design, and management of stroke-gesture datasets for interactive applications. GestMan stores stroke-gestures at multiple levels of representation, from individual samples to classes, clusters, and vocabularies, and enables practitioners to process, analyze, classify, compile, and reconfigure sets of gesture commands according to the specific requirements of their applications, prototypes, and interactive systems. Our online tool supports acquisition of 2-D stroke-gestures via an HTML5-based user interface, as well as 3-D touch+air and webcam-based gestures via dedicated mappers. GestMan implements five software quality characteristics of the ISO/IEC 25010 standard and employs a new mathematical formalization of stroke-gestures as vectors to support efficient computation of various gesture features.
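The abstract's idea of treating a stroke-gesture as a vector of sampled points, over which gesture features are computed, can be illustrated with a minimal sketch. This is a hypothetical stand-in, not the paper's actual formalization: a 2-D stroke is modeled as a sequence of (x, y) points, and two simple geometric features are derived from it.

```python
import math

# Hypothetical representation (not the paper's formalization): a 2-D
# stroke-gesture sample as an ordered sequence of (x, y) points.

def path_length(stroke):
    """Total Euclidean arc length of the stroke, a common gesture feature."""
    return sum(math.dist(p, q) for p, q in zip(stroke, stroke[1:]))

def centroid(stroke):
    """Mean point of the stroke, used e.g. for translation normalization."""
    n = len(stroke)
    return (sum(x for x, _ in stroke) / n, sum(y for _, y in stroke) / n)

demo = [(0.0, 0.0), (3.0, 4.0), (6.0, 0.0)]
print(path_length(demo))  # 10.0
print(centroid(demo))
```

Storing strokes in such a vector form lets features like these be computed in a single linear pass, which is consistent with the abstract's emphasis on efficient feature computation.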



Published In

EICS '19: Proceedings of the ACM SIGCHI Symposium on Engineering Interactive Computing Systems
June 2019
141 pages
ISBN:9781450367455
DOI:10.1145/3319499

Publisher

Association for Computing Machinery, New York, NY, United States

Badges

  • Best Note

Author Tags

  1. cloud computing
  2. gesture data management
  3. gesture sets
  4. isochronicity
  5. isometricity
  6. isoparameterization
  7. stroke-gestures
  8. tool

Qualifiers

  • Short-paper

Conference

EICS '19

Acceptance Rates

Overall Acceptance Rate 73 of 299 submissions, 24%

Article Metrics

  • Downloads (last 12 months): 17
  • Downloads (last 6 weeks): 1
Reflects downloads up to 18 Feb 2025

Cited By

  • (2023) Gesture-Based Computing. Handbook of Human-Machine Systems, 397-408. DOI: 10.1002/9781119863663.ch32. Online publication date: 7-Jul-2023.
  • (2022) The Gesture Authoring Space: Authoring Customised Hand Gestures for Grasping Virtual Objects in Immersive Virtual Environments. Proceedings of Mensch und Computer 2022, 85-95. DOI: 10.1145/3543758.3543766. Online publication date: 4-Sep-2022.
  • (2022) QuantumLeap, a Framework for Engineering Gestural User Interfaces based on the Leap Motion Controller. Proceedings of the ACM on Human-Computer Interaction 6, EICS, 1-47. DOI: 10.1145/3532211. Online publication date: 17-Jun-2022.
  • (2022) Understanding Gesture Input Articulation with Upper-Body Wearables for Users with Upper-Body Motor Impairments. Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, 1-16. DOI: 10.1145/3491102.3501964. Online publication date: 29-Apr-2022.
  • (2022) GearWheels: A Software Tool to Support User Experiments on Gesture Input with Wearable Devices. International Journal of Human-Computer Interaction, 1-19. DOI: 10.1080/10447318.2022.2098907. Online publication date: 22-Jul-2022.
  • (2021) GestuRING: A Web-based Tool for Designing Gesture Input with Rings, Ring-Like, and Ring-Ready Devices. The 34th Annual ACM Symposium on User Interface Software and Technology, 710-723. DOI: 10.1145/3472749.3474780. Online publication date: 10-Oct-2021.
  • (2021) How Do HCI Researchers Describe Their Software Tools? Insights From a Synopsis Survey of Tools for Multimodal Interaction. Companion Publication of the 2021 International Conference on Multimodal Interaction, 7-12. DOI: 10.1145/3461615.3485431. Online publication date: 18-Oct-2021.
  • (2021) PolyRec Gesture Design Tool: A tool for fast prototyping of gesture-based mobile applications. Software: Practice and Experience 52, 2, 594-618. DOI: 10.1002/spe.3024. Online publication date: 13-Sep-2021.
  • (2020) DG3: Exploiting Gesture Declarative Models for Sample Generation and Online Recognition. Proceedings of the ACM on Human-Computer Interaction 4, EICS, 1-21. DOI: 10.1145/3397870. Online publication date: 18-Jun-2020.
