DOI: 10.1145/2858036.2858284
Research article

PathSync: Multi-User Gestural Interaction with Touchless Rhythmic Path Mimicry

Published: 07 May 2016

Abstract

In this paper, we present PathSync, a novel, distal, multi-user mid-air gestural technique based on the principle of rhythmic path mimicry; by replicating the movement of a screen-represented pattern with their hand, users can interact with digital objects quickly, intuitively, and with a high level of accuracy. We present three studies that contribute, respectively, (1) improvements to how correlation is calculated in path-mimicry techniques, necessary for touchless interaction, (2) a validation of its efficiency in comparison to existing techniques, and (3) a demonstration of its intuitiveness and multi-user capacity 'in the wild'. Our studies consequently demonstrate PathSync's potential as an immediately legitimate alternative to existing techniques, with key advantages for public display and multi-user applications.
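The paper's exact correlation measure is not reproduced on this page, but the general motion-correlation idea behind path mimicry can be sketched: sample the hand position and each on-screen target's path over a sliding time window, correlate them axis by axis, and select the target whose motion the hand best matches. The per-axis Pearson averaging and the 0.8 threshold below are illustrative assumptions, not the paper's published parameters.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation of two 1-D signals; 0 if either is constant."""
    if np.std(a) == 0 or np.std(b) == 0:
        return 0.0
    return float(np.corrcoef(a, b)[0, 1])

def motion_correlation(hand, target):
    """Correlate a hand trajectory with a target's path, axis by axis.

    hand, target: (N, 2) arrays of x/y samples over the same time window.
    Returns the mean of the per-axis Pearson correlations, in [-1, 1].
    """
    hand = np.asarray(hand, dtype=float)
    target = np.asarray(target, dtype=float)
    rx = pearson(hand[:, 0], target[:, 0])
    ry = pearson(hand[:, 1], target[:, 1])
    return (rx + ry) / 2.0

def select_target(hand, target_paths, threshold=0.8):
    """Return the index of the best-matching target path, or None
    if no target correlates strongly enough (threshold is illustrative)."""
    scores = [motion_correlation(hand, p) for p in target_paths]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Toy example: two circular widgets moving in opposite phase.
t = np.linspace(0, 2 * np.pi, 120)
path_a = np.stack([np.cos(t), np.sin(t)], axis=1)
path_b = np.stack([np.cos(t + np.pi), np.sin(t + np.pi)], axis=1)
# A hand trajectory that noisily mimics path A.
hand = path_a + np.random.default_rng(0).normal(0, 0.05, path_a.shape)
```

Here `select_target(hand, [path_a, path_b])` resolves to target 0, since the noisy trajectory correlates strongly with path A and negatively with the opposite-phase path B; per-axis correlation distinguishes even same-shaped paths that differ only in phase, which is what lets many targets be displayed at once.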

Supplementary Material

  • Supplemental video: suppl.mov (pn1287-file3.mp4)
  • MP4 file: p3415-carter.mp4



Published In

CHI '16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems
May 2016, 6108 pages
ISBN: 9781450333627
DOI: 10.1145/2858036

Publisher

Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. kinect
    2. pathsync
    3. touchless interaction


Conference

CHI '16: CHI Conference on Human Factors in Computing Systems
May 7–12, 2016
San Jose, California, USA

Acceptance Rates

CHI '16 paper acceptance rate: 565 of 2,435 submissions (23%)
Overall acceptance rate: 6,199 of 26,314 submissions (24%)



Cited By

  • (2024) Do I Just Tap My Headset? Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 7(4), 1–28. DOI: 10.1145/3631451. Online: 12 Jan 2024
  • (2024) Detection of visual pursuits using 1D convolutional neural networks. Pattern Recognition Letters 179, 45–51. DOI: 10.1016/j.patrec.2024.01.020. Online: 1 Mar 2024
  • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81, 102608. DOI: 10.1016/j.displa.2023.102608. Online: Jan 2024
  • (2024) SoundOrbit: motion-correlation interaction with auditory orbital trajectories. Personal and Ubiquitous Computing 28(5), 763–778. DOI: 10.1007/s00779-024-01818-4. Online: 15 Jun 2024
  • (2023) EyePursuitLinks: an Eye-pursuit Based Interface for Web Browsing Using Smart Targets. Proceedings of the 29th Brazilian Symposium on Multimedia and the Web, 16–24. DOI: 10.1145/3617023.3617058. Online: 23 Oct 2023
  • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56(2), 1–38. DOI: 10.1145/3606947. Online: 15 Sep 2023
  • (2023) Rhythm Research in Interactive System Design: A Literature Review. International Journal of Human–Computer Interaction, 1–20. DOI: 10.1080/10447318.2023.2294628. Online: 27 Dec 2023
  • (2023) The effect of hands synchronicity on users' perceived arm fatigue in virtual reality environments. International Journal of Human-Computer Studies 178, 103092. DOI: 10.1016/j.ijhcs.2023.103092. Online: Oct 2023
  • (2022) Rhythmic-Synchronization-Based Interaction: Effect of Interfering Auditory Stimuli, Age and Gender on Users' Performances. Applied Sciences 12(6), 3053. DOI: 10.3390/app12063053. Online: 17 Mar 2022
  • (2022) Push or Pinch? Exploring Slider Control Gestures for Touchless User Interfaces. Nordic Human-Computer Interaction Conference, 1–10. DOI: 10.1145/3546155.3546702. Online: 8 Oct 2022
