Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch

Published: 11 September 2017

Abstract

In this work, we consider how users can employ body movement for remote control with minimal effort and maximum flexibility. TraceMatch is a novel technique in which the interface displays available controls as circular widgets with orbiting targets, and users trigger a control by mimicking the displayed motion. The technique uses computer vision to detect circular motion as a uniform type of input, but is highly appropriable, as users can produce matching motion with any part of their body. We present three studies that investigate input performance with different parts of the body, user preferences, and spontaneous choice of movements for input in realistic application scenarios. The results show that users can provide effective input with their head, with their hands, and while holding objects; that multiple controls can be effectively distinguished by differences in the presented phase and direction of movement; and that users choose and switch modes of input seamlessly.

Supplementary Material

clarke (clarke.zip)
Supplemental movie, appendix, image, and software files for "Remote Control by Body Movement in Synchrony with Orbiting Widgets: an Evaluation of TraceMatch".

        Published In

Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, Volume 1, Issue 3
        September 2017
        2023 pages
        EISSN:2474-9567
        DOI:10.1145/3139486
        Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

        Publisher

        Association for Computing Machinery

        New York, NY, United States

        Publication History

        Published: 11 September 2017
        Accepted: 01 September 2017
        Revised: 01 June 2017
        Received: 01 February 2017
        Published in IMWUT Volume 1, Issue 3

        Author Tags

        1. Computer vision
        2. Gesture input
        3. Input techniques
        4. Motion correlation
        5. Motion matching
        6. Movement correlation
        7. Path mimicry
        8. Remote control
        9. User evaluation
        10. User input
        11. Vision-based interfaces

        Qualifiers

        • Research-article
        • Research
        • Refereed

        Cited By

• (2024) SoundOrbit: motion-correlation interaction with auditory orbital trajectories. Personal and Ubiquitous Computing 28(5), 763-778. DOI: 10.1007/s00779-024-01818-4. Online publication date: 1-Oct-2024.
• (2023) ThingShare: Ad-Hoc Digital Copies of Physical Objects for Sharing Things in Video Meetings. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-22. DOI: 10.1145/3544548.3581148. Online publication date: 19-Apr-2023.
• (2022) Rhythmic-Synchronization-Based Interaction: Effect of Interfering Auditory Stimuli, Age and Gender on Users’ Performances. Applied Sciences 12(6), 3053. DOI: 10.3390/app12063053. Online publication date: 17-Mar-2022.
• (2022) One-handed Input for Mobile Devices via Motion Matching and Orbits Controls. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 6(2), 1-24. DOI: 10.1145/3534624. Online publication date: 7-Jul-2022.
• (2020) Facilitating Temporal Synchronous Target Selection through User Behavior Modeling. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies 3(4), 1-24. DOI: 10.1145/3369839. Online publication date: 14-Sep-2020.
• (2020) Comparing Selection Mechanisms for Gaze Input Techniques in Head-mounted Displays. International Journal of Human-Computer Studies, 102414. DOI: 10.1016/j.ijhcs.2020.102414. Online publication date: Feb-2020.
• (2019) DialPlates. Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia, 1-10. DOI: 10.1145/3365610.3365626. Online publication date: 26-Nov-2019.
• (2019) Wattom. Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, 307-313. DOI: 10.1145/3294109.3295642. Online publication date: 17-Mar-2019.
• (2019) Designing Motion Matching for Real-World Applications. Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, 645-656. DOI: 10.1145/3294109.3295628. Online publication date: 17-Mar-2019.
• (2018) How Memorizing Positions or Directions Affects Gesture Learning? Proceedings of the 2018 ACM International Conference on Interactive Surfaces and Spaces, 107-114. DOI: 10.1145/3279778.3279787. Online publication date: 19-Nov-2018.
