
Interactive motion mapping for real-time character control

Published: 01 May 2014

Abstract

It is now possible to capture the 3D motion of the human body on consumer hardware and to puppet skeleton-based virtual characters in real time. However, many characters do not have humanoid skeletons. Characters such as spiders and caterpillars have no bony skeletons at all, and their shapes and motions differ greatly from a human's. In general, character control under arbitrary shape and motion transformations is unsolved: how might these motions be mapped? We present a method for controlling characters that avoids the rigging-and-skinning pipeline: neither source nor target characters need skeletons or rigs. We use interactively defined sparse pose correspondences to learn a mapping between arbitrary 3D point source sequences and mesh target sequences, and then puppet the target character in real time. We demonstrate the versatility of our method with results on diverse virtual characters driven by different input motion controllers. Our method provides a fast, flexible, and intuitive interface for arbitrary motion mapping and opens new ways to control characters for real-time animation.
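The abstract outlines a pipeline in which a small set of interactively specified pose correspondences is used to learn a mapping from source point clouds to target mesh poses, which is then evaluated per frame for real-time puppetry. As a rough, non-authoritative sketch of that idea, the Python snippet below fits a Gaussian-kernel ridge regression over sparse correspondences and applies it to a live source pose; the function names, kernel choice, and toy dimensions are assumptions made for illustration, not the paper's actual formulation.

# Minimal sketch of sparse-correspondence motion mapping: learn a regression
# from a few source/target pose pairs, then drive the target mesh from live
# source poses. Gaussian-kernel ridge regression is an illustrative choice,
# not necessarily the mapping used in the paper.
import numpy as np


def fit_pose_mapping(source_poses, target_poses, sigma=1.0, reg=1e-6):
    """source_poses: (n, ds) flattened source point clouds, one row per correspondence.
    target_poses: (n, dt) flattened target mesh vertex positions for the same poses.
    Returns a function mapping a new source pose to target vertex positions."""
    X = np.asarray(source_poses, dtype=float)
    Y = np.asarray(target_poses, dtype=float)

    def kernel(A, B):
        # Gaussian (RBF) kernel between rows of A and rows of B.
        d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
        return np.exp(-d2 / (2.0 * sigma**2))

    K = kernel(X, X)
    # Solve (K + reg*I) W = Y once offline; per-frame evaluation is then cheap.
    W = np.linalg.solve(K + reg * np.eye(len(X)), Y)

    def map_pose(source_pose):
        k = kernel(np.asarray(source_pose, dtype=float)[None, :], X)
        return (k @ W).ravel()

    return map_pose


if __name__ == "__main__":
    # Toy example: 5 correspondences, 30-D source poses, 90-D target meshes.
    rng = np.random.default_rng(0)
    src = rng.normal(size=(5, 30))
    tgt = rng.normal(size=(5, 90))
    mapper = fit_pose_mapping(src, tgt, sigma=5.0)
    live_frame = src[0] + 0.01 * rng.normal(size=30)
    print(mapper(live_frame).shape)  # (90,) target vertex positions each frame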




    Published In

    Computer Graphics Forum, Volume 33, Issue 2 (May 2014), 509 pages
    ISSN: 0167-7055; EISSN: 1467-8659

    Publisher

    The Eurographics Association & John Wiley & Sons, Ltd., Chichester, United Kingdom



    Cited By

    • (2024) Embodied Tentacle: Mapping Design to Control of Non-Analogous Body Parts with the Human Body. Proceedings of the CHI Conference on Human Factors in Computing Systems, pp. 1-19. DOI: 10.1145/3613904.3642340. Online publication date: 11-May-2024.
    • (2023) What is it like to be a bot? Variable perspective embodied telepresence for crowdsourcing robot movements. Personal and Ubiquitous Computing 27:2, pp. 299-315. DOI: 10.1007/s00779-022-01684-y. Online publication date: 1-Apr-2023.
    • (2022) VCPoser: Interactive Pose Generation of Virtual Characters Corresponding to Human Pose Input. Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology, pp. 1-10. DOI: 10.1145/3562939.3565640. Online publication date: 29-Nov-2022.
    • (2021) Research on a method of creating digital shadow puppets based on parameterized templates. Multimedia Tools and Applications 80:13, pp. 20403-20422. DOI: 10.1007/s11042-021-10726-1. Online publication date: 1-May-2021.
    • (2020) MoveAE. Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, pp. 481-489. DOI: 10.1145/3319502.3374807. Online publication date: 9-Mar-2020.
    • (2019) Interactive animation generation of virtual characters using single RGB-D camera. The Visual Computer: International Journal of Computer Graphics 35:6-8, pp. 849-860. DOI: 10.1007/s00371-019-01678-7. Online publication date: 1-Jun-2019.
    • (2018) Surface based motion retargeting by preserving spatial relationship. Proceedings of the 11th ACM SIGGRAPH Conference on Motion, Interaction and Games, pp. 1-11. DOI: 10.1145/3274247.3274507. Online publication date: 8-Nov-2018.
    • (2018) VR Animals. Proceedings of the 2018 Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts, pp. 503-511. DOI: 10.1145/3270316.3271531. Online publication date: 23-Oct-2018.
    • (2018) Master of Puppets. 3D Research 9:1, pp. 1-14. DOI: 10.1007/s13319-018-0158-y. Online publication date: 1-Mar-2018.
    • (2017) Performance-Based Biped Control using a Consumer Depth Camera. Computer Graphics Forum 36:2, pp. 387-395. DOI: 10.5555/3128975.3129010. Online publication date: 1-May-2017.
