Research article
Open access

Predicting Perceived Naturalness of Human Animations Based on Generative Movement Primitive Models

Published: 06 September 2019
    Abstract

    We compared the perceptual validity of human avatar walking animations driven by six different representations of human movement using a graphics Turing test. All six representations are based on movement primitives (MPs), which are predictive models of full-body movement that differ in their complexity and prediction mechanism. Assuming that humans are experts at perceiving biological movement from noisy sensory signals, it follows that these percepts should be describable by a suitably constructed Bayesian ideal observer model. We build such models from MPs and investigate whether the perceived naturalness of human animations is predictable from approximate Bayesian model scores of the MPs. We found that certain MP-based representations are capable of producing movements that are perceptually indistinguishable from natural movements. Furthermore, approximate Bayesian model scores of these representations can be used to predict perceived naturalness. In particular, we showed that movement dynamics are more important for the perceived naturalness of human animations than single-frame poses. This indicates that the perception of human animations is highly sensitive to their temporal coherence. More generally, our results add evidence for a shared MP representation of action and perception. Even though the motivation of our work is primarily drawn from neuroscience, we expect that our results will be applicable in virtual and augmented reality settings where perceptually plausible human avatar movements are required.
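
    To make the prediction pipeline concrete, the following Python sketch shows one way such approximate Bayesian model scores could be computed and related to naturalness judgments. It is a minimal illustration under simplifying assumptions, not the authors' implementation: the paper's MP models include, for example, Gaussian process dynamical models and dynamic movement primitives, whereas the sketch uses a deliberately simple PCA-based linear-Gaussian trajectory model, and all function names (fit_trajectory_mps, log_evidence, naturalness_from_scores) are hypothetical.

        # Illustrative sketch only (assumed names and model); requires NumPy and SciPy.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import multivariate_normal

        def fit_trajectory_mps(trajectories, n_primitives=5):
            """Fit spatio-temporal movement primitives by PCA over whole trajectories.

            trajectories: array of shape (n_trials, n_frames, n_dofs), e.g. joint angles.
            Returns the mean trajectory, the primitive basis, the primitive weight
            variances, and the residual noise variance of the implied linear-Gaussian
            generative model.
            """
            n_trials, n_frames, n_dofs = trajectories.shape
            X = trajectories.reshape(n_trials, n_frames * n_dofs)
            mean = X.mean(axis=0)
            _, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
            basis = Vt[:n_primitives]          # shape (n_primitives, n_frames * n_dofs)
            weight_var = (S[:n_primitives] ** 2) / n_trials
            resid_var = max((S[n_primitives:] ** 2).sum() / (n_trials * X.shape[1]), 1e-6)
            return mean, basis, weight_var, resid_var

        def log_evidence(trajectory, mean, basis, weight_var, resid_var):
            """Approximate log marginal likelihood ("model score") of one trajectory.

            Generative model: x = mean + basis.T @ w + noise, with Gaussian w and noise,
            so the marginal over x is Gaussian with a low-rank-plus-diagonal covariance.
            Only feasible for short or downsampled trajectories, since the covariance is
            (n_frames * n_dofs) by (n_frames * n_dofs).
            """
            x = trajectory.ravel()
            cov = (basis.T * weight_var) @ basis + resid_var * np.eye(x.size)
            return multivariate_normal.logpdf(x, mean=mean, cov=cov)

        def naturalness_from_scores(scores, judged_natural):
            """Relate per-stimulus model scores to binary "looks natural" responses with
            a logistic regression fitted by maximum likelihood. Returns (slope, intercept);
            a positive slope means higher scores predict a higher probability of the
            stimulus being judged natural.
            """
            s = (scores - scores.mean()) / scores.std()
            y = np.asarray(judged_natural, dtype=float)

            def nll(params):
                slope, intercept = params
                p = 1.0 / (1.0 + np.exp(-(slope * s + intercept)))
                p = np.clip(p, 1e-9, 1.0 - 1e-9)
                return -np.sum(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

            return minimize(nll, x0=np.zeros(2)).x

    In the study itself, the scores come from the six MP-based representations of differing complexity, and the behavioural data come from the graphics Turing test; the sketch only indicates how a generative model score of this kind can be fitted from motion-capture trajectories and related to naturalness judgments.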

    Supplementary Material

    knopp (knopp.zip)
    Supplemental movie and image files for "Predicting Perceived Naturalness of Human Animations Based on Generative Movement Primitive Models"

    Cited By

    • (2024) Body movement and emotion: Investigating the impact of audiovisual tempo manipulations on emotional arousal and valence. Jahrbuch Musikpsychologie, 10.5964/jbdgm.19132. Online publication date: 31-Jul-2024
    • (2023) A primitive-based representation of dance: modulations by experience and perceptual validity. Journal of Neurophysiology, 10.1152/jn.00161.2023, 130(5), 1214-1225. Online publication date: 1-Nov-2023
    • (2020) Evaluating Perceptual Predictions based on Movement Primitive Models in VR- and Online-Experiments. ACM Symposium on Applied Perception 2020, 10.1145/3385955.3407940, 1-9. Online publication date: 12-Sep-2020
    • (2020) Walk Ratio: Perception of an Invariant Parameter of Human Walk on Virtual Characters. ACM Symposium on Applied Perception 2020, 10.1145/3385955.3407926, 1-9. Online publication date: 12-Sep-2020

    Published In

    ACM Transactions on Applied Perception, Volume 16, Issue 3
    Special Issue on SAP 2019 and Regular Paper
    July 2019
    91 pages
    ISSN:1544-3558
    EISSN:1544-3965
    DOI:10.1145/3360014
    This work is licensed under a Creative Commons Attribution 4.0 International License.

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Publication History

    Published: 06 September 2019
    Accepted: 01 August 2019
    Received: 01 July 2019
    Published in TAP Volume 16, Issue 3

    Author Tags

    1. Gaussian process dynamical model
    2. human animation
    3. dynamical movement primitives
    4. dynamical systems
    5. movement primitives
    6. perception
    7. psychophysics

    Qualifiers

    • Research-article
    • Research
    • Refereed

    Article Metrics

    • Downloads (Last 12 months): 136
    • Downloads (Last 6 weeks): 18
    Reflects downloads up to 10 Aug 2024
