
Motion Correlation: Selecting Objects by Matching Their Movement

Published: 28 April 2017

Abstract

Selection is a canonical task in user interfaces, commonly supported by presenting objects for acquisition by pointing. In this article, we consider motion correlation as an alternative for selection. The principle is to represent available objects by motion in the interface, have users identify a target by mimicking its specific motion, and use the correlation between the system’s output and the user’s input to determine the selection. The resulting interaction has compelling properties, as users are guided by motion feedback and only need to copy a presented motion. Motion correlation has been explored in earlier work but has only recently begun to feature in holistic interface designs. We provide a first comprehensive review of the principle and present an analysis of five previously published works in which motion correlation underpinned the design of novel gaze and gesture interfaces for diverse application contexts. We derive guidelines for motion correlation algorithms, motion feedback, choice of modalities, and the overall design of motion correlation interfaces, and we identify opportunities and challenges for future research and design.
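To make the matching step concrete, the sketch below shows one possible matcher: it correlates the user’s input trajectory against each moving target’s trajectory over a shared time window and selects the best match above a threshold. The per-axis Pearson correlation and the 0.8 threshold are illustrative assumptions in the spirit of pursuit-based systems, not parameters prescribed by this article; the function name and data layout are likewise hypothetical.

```python
import numpy as np

def select_by_motion_correlation(user_xy, targets_xy, threshold=0.8):
    """Select the moving target whose trajectory best matches the user's input.

    user_xy    -- (N, 2) array of user input positions (gaze, hand, ...)
                  sampled over the same time window as the targets.
    targets_xy -- dict mapping target id -> (N, 2) array of that target's
                  on-screen positions over the same window.
    threshold  -- minimum correlation required on both axes (illustrative).

    Returns the id of the selected target, or None if nothing matches.
    """
    best_id, best_score = None, -1.0
    for tid, t_xy in targets_xy.items():
        # Correlate x against x and y against y. Requiring BOTH axes to
        # exceed the threshold suppresses accidental single-axis agreement.
        rx = np.corrcoef(user_xy[:, 0], t_xy[:, 0])[0, 1]
        ry = np.corrcoef(user_xy[:, 1], t_xy[:, 1])[0, 1]
        score = min(rx, ry)
        if score > threshold and score > best_score:
            best_id, best_score = tid, score
    return best_id
```

In a live interface, the input and target trajectories would be refreshed every frame from a sliding window of recent samples, so a selection fires once the user has mimicked a target’s motion long enough for the correlation to stabilise.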

    Published In

    ACM Transactions on Computer-Human Interaction, Volume 24, Issue 3
    June 2017, 244 pages
    ISSN: 1073-0516
    EISSN: 1557-7325
    DOI: 10.1145/3086563
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 28 April 2017
    Accepted: 01 March 2017
    Revised: 01 February 2017
    Received: 01 August 2016
    Published in TOCHI Volume 24, Issue 3

    Author Tags

    1. Motion correlation
    2. eye tracking
    3. gaze interaction
    4. gesture interfaces
    5. interaction techniques
    6. motion tracking
    7. natural user interfaces

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Funding Sources

    • Victorian State Government and Microsoft
    • Google through a Faculty Research Award
    • Microsoft Research Centre for Social Natural User Interfaces (SocialNUI)

    Cited By

    • (2024) GazeSwitch: Automatic Eye-Head Mode Switching for Optimised Hands-Free Pointing. Proceedings of the ACM on Human-Computer Interaction 8, ETRA, 1-20. DOI: 10.1145/3655601. Online publication date: 28-May-2024.
    • (2024) Detection of visual pursuits using 1D convolutional neural networks. Pattern Recognition Letters 179, C, 45-51. DOI: 10.1016/j.patrec.2024.01.020. Online publication date: 1-Mar-2024.
    • (2024) Design recommendations of target size and tracking speed under circular and square trajectories for smooth pursuit with Euclidean algorithm in eye-control system. Displays 81, 102608. DOI: 10.1016/j.displa.2023.102608. Online publication date: Jan-2024.
    • (2024) Research on a spatial–temporal characterisation of blink-triggered eye control interactions. Advanced Engineering Informatics 59, 102297. DOI: 10.1016/j.aei.2023.102297. Online publication date: Jan-2024.
    • (2024) SoundOrbit: motion-correlation interaction with auditory orbital trajectories. Personal and Ubiquitous Computing. DOI: 10.1007/s00779-024-01818-4. Online publication date: 15-Jun-2024.
    • (2023) Proxemic Cursor Interactions for Touchless Widget Control. Proceedings of the 2023 ACM Symposium on Spatial User Interaction, 1-12. DOI: 10.1145/3607822.3614525. Online publication date: 13-Oct-2023.
    • (2023) An End-to-End Review of Gaze Estimation and its Interactive Applications on Handheld Mobile Devices. ACM Computing Surveys 56, 2, 1-38. DOI: 10.1145/3606947. Online publication date: 15-Sep-2023.
    • (2023) GazeCast: Using Mobile Devices to Allow Gaze-based Interaction on Public Displays. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, 1-8. DOI: 10.1145/3588015.3589663. Online publication date: 30-May-2023.
    • (2023) Comparing Dwell time, Pursuits and Gaze Gestures for Gaze Interaction on Handheld Mobile Devices. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-17. DOI: 10.1145/3544548.3580871. Online publication date: 19-Apr-2023.
    • (2023) Vergence Matching: Inferring Attention to Objects in 3D Environments for Gaze-Assisted Selection. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1-15. DOI: 10.1145/3544548.3580685. Online publication date: 19-Apr-2023.
