DOI: 10.1145/1944745.1944768

Realtime human motion control with a small number of inertial sensors

Published: 18 February 2011

Abstract

This paper introduces an approach to performance animation that employs a small number of motion sensors to create an easy-to-use system for interactive control of a full-body human character. Our key idea is to construct a series of online local dynamic models from a prerecorded motion database and utilize them to construct full-body human motion in a maximum a posteriori (MAP) framework. We have demonstrated the effectiveness of our system by controlling a variety of human actions, such as boxing, golf swinging, and table tennis, in real time. Given an appropriate motion capture database, the results are comparable in quality to those obtained from a commercial motion capture system with a full set of motion sensors (e.g., XSens [2009]); however, our performance animation system is far less intrusive and expensive because it requires only a small number of motion sensors for full-body control. We have also evaluated the performance of our system with leave-one-out experiments and by comparing it against two baseline algorithms.
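The MAP formulation described above combines an observation term (sparse sensor readings) with a motion prior (a local dynamic model fit to nearby poses in the database). The following is an illustrative sketch of that idea, not the paper's actual implementation: all names, dimensions, and the closed-form quadratic solve are assumptions made for the example, using a local linear dynamic model in place of whatever model the authors fit.

```python
import numpy as np

# Hypothetical sketch: reconstruct a full-body pose q_t from sparse sensor
# readings c_t by minimizing the negative log-posterior
#   E(q_t) = ||C q_t - c_t||^2 / s_obs^2 + ||q_t - (A q_{t-1} + b)||^2 / s_dyn^2
# where C maps a full pose to the sensed degrees of freedom, and (A, b) is a
# local linear dynamic model fit to prerecorded poses near the current state.

def fit_local_dynamics(prev_poses, next_poses):
    """Least-squares fit of a local linear model: q_next ~ A @ q_prev + b."""
    X = np.hstack([prev_poses, np.ones((len(prev_poses), 1))])
    W, *_ = np.linalg.lstsq(X, next_poses, rcond=None)
    return W[:-1].T, W[-1]  # A: (d, d), b: (d,)

def map_pose(c_t, q_prev, C, A, b, s_obs=1.0, s_dyn=1.0):
    """Closed-form minimizer of the quadratic MAP objective above."""
    d = q_prev.size
    H = C.T @ C / s_obs**2 + np.eye(d) / s_dyn**2   # Hessian of E
    g = C.T @ c_t / s_obs**2 + (A @ q_prev + b) / s_dyn**2
    return np.linalg.solve(H, g)
```

Because both terms are quadratic in q_t, the posterior mode has a closed form; a real system with nonlinear sensor models would instead run a few Gauss-Newton or Levenberg-Marquardt iterations (cf. the levmar reference [17]).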

Supplementary Material

AVI File (p133-liu.avi)

References

[1]
Arikan, O., and Forsyth, D. A. 2002. Interactive Motion Generation from Examples. In ACM Transactions on Graphics. 21(3):483--490.
[2]
Badler, N. I., Hollick, M., and Granieri, J. 1993. Realtime Control of A Virtual Human using Minimal Sensors. In Presence. 2(1):82--86.
[3]
Bazaraa, M. S., Sherali, H. D., and Shetty, C. M. 1993. Nonlinear Programming: Theory and Algorithms. John Wiley and Sons Ltd. 2nd Edition.
[4]
Brand, M., and Hertzmann, A. 2000. Style Machines. In Proceedings of ACM SIGGRAPH 2000. 183--192.
[5]
Chai, J., and Hodgins, J. 2005. Performance Animation from Low-dimensional Control Signals. In ACM Transactions on Graphics. 24(3):686--696.
[6]
Chai, J., and Hodgins, J. 2007. Constraint-based Motion Optimization Using A Statistical Dynamic Model. In ACM Transactions on Graphics. 26(3): Article No. 8.
[7]
Grochow, K., Martin, S. L., Hertzmann, A., and Popović, Z. 2004. Style-based Inverse Kinematics. In ACM Transactions on Graphics. 23(3):522--531.
[8]
Heck, R., and Gleicher, M. 2007. Parametric Motion Graphs. In Proceedings of the 2007 symposium on Interactive 3D graphics and games. 129--136.
[9]
Ishigaki, S., White, T., Zordan, V. B., and Liu, C. K. 2009. Performance-based Control Interface for Character Animation. In ACM Transactions on Graphics. 28(3):1--8.
[10]
Kovar, L., and Gleicher, M. 2004. Automated Extraction and Parameterization of Motions in Large Data Sets. In ACM Transactions on Graphics. 23(3):559--568.
[11]
Kovar, L., Gleicher, M., and Pighin, F. 2002. Motion Graphs. In ACM Transactions on Graphics. 21(3):473--482.
[12]
Kwon, T., and Shin, S. Y. 2005. Motion modeling for online locomotion synthesis. In ACM SIGGRAPH Symposium on Computer Animation. 29--38.
[13]
Lau, M., Chai, J., Xu, Y.-Q., and Shum, H. 2009. Face Poser: Interactive Modeling of 3D Facial Expressions Using Facial Priors. In ACM Transactions on Graphics. 29(1): Article No. 3.
[14]
Lee, J., Chai, J., Reitsma, P., Hodgins, J., and Pollard, N. 2002. Interactive Control of Avatars Animated With Human Motion Data. In ACM Transactions on Graphics. 21(3):491--500.
[15]
Lee, K. H., Choi, M. G., and Lee, J. 2006. Motion patches: building blocks for virtual environments annotated with motion data. In ACM Transactions on Graphics. 25(3):898--906.
[16]
Li, Y., Wang, T., and Shum, H.-Y. 2002. Motion Texture: A Two-level Statistical Model for Character Synthesis. In ACM Transactions on Graphics. 21(3):465--472.
[17]
Lourakis, M. I. A. 2009. levmar: Levenberg-Marquardt nonlinear least squares algorithms in C/C++.
[18]
Min, J., Chen, Y.-L., and Chai, J. 2009. Interactive Generation of Human Animation with Deformable Motion Models. In ACM Transactions on Graphics. 29(1): Article No. 9.
[19]
Mukai, T., and Kuriyama, S. 2005. Geostatistical Motion Interpolation. In ACM Transactions on Graphics. 24(3):1062--1070.
[20]
Ren, L., Shakhnarovich, G., Hodgins, J. K., Pfister, H., and Viola, P. A. 2004. Learning Silhouette Features for Control of Human Motion. In Computer Science Technical Reports 2004, Carnegie Mellon University. CMU-CS-04-165.
[21]
Rose, C., Cohen, M. F., and Bodenheimer, B. 1998. Verbs and Adverbs: Multidimensional Motion Interpolation. In IEEE Computer Graphics and Applications. 18(5):32--40.
[22]
Safonova, A., and Hodgins, J. K. 2007. Construction and optimal search of interpolated motion graphs. In ACM Transactions on Graphics. 26(3).
[23]
Semwal, S., Hightower, R., and Stansfield, S. 1998. Mapping Algorithms for Real-time Control of An Avatar using Eight Sensors. In Presence. 7(1):1--21.
[24]
Slyper, R., and Hodgins, J. 2008. Action capture with accelerometers. In 2008 ACM SIGGRAPH / Eurographics Symposium on Computer Animation.
[25]
Systems, S. E. T., 2003. http://www.eyetoy.com.
[26]
Xsens, 2009. http://www.xsens.com.
[27]
Yin, K., and Pai, D. K. 2003. FootSee: An Interactive Animation System. In Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation. 329--338.



Published In

cover image ACM Conferences
I3D '11: Symposium on Interactive 3D Graphics and Games
February 2011
207 pages
ISBN:9781450305655
DOI:10.1145/1944745

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. data-driven animation
  2. inertial sensors
  3. motion capture
  4. natural user interfaces
  5. performance animation

Qualifiers

  • Research-article

Conference

I3D '11: Symposium on Interactive 3D Graphics and Games
February 18--20, 2011
San Francisco, California

Acceptance Rates

I3D '11 Paper Acceptance Rate: 24 of 64 submissions, 38%
Overall Acceptance Rate: 148 of 485 submissions, 31%

Article Metrics

  • Downloads (Last 12 months): 43
  • Downloads (Last 6 weeks): 2

Reflects downloads up to 03 Sep 2024

Cited By

  • (2024) Ultra Inertial Poser: Scalable Motion Capture and Tracking from Sparse Inertial Sensors and Ultra-Wideband Ranging. ACM SIGGRAPH 2024 Conference Papers, 1--11. DOI: 10.1145/3641519.3657465. Online publication date: 13-Jul-2024.
  • (2024) Full-body Avatar Generation for Increased Embodiment. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 1063--1065. DOI: 10.1109/VRW62533.2024.00328. Online publication date: 16-Mar-2024.
  • (2024) DTP: Learning to Estimate Full-body Pose in Real-time from Sparse VR Sensor Measurements. Virtual Reality, 28(2). DOI: 10.1007/s10055-024-01011-1. Online publication date: 23-May-2024.
  • (2023) QuestEnvSim: Environment-Aware Simulated Motion Tracking from Sparse Sensors. ACM SIGGRAPH 2023 Conference Proceedings, 1--9. DOI: 10.1145/3588432.3591504. Online publication date: 23-Jul-2023.
  • (2023) SmartPoser: Arm Pose Estimation with a Smartphone and Smartwatch Using UWB and IMU Data. Proceedings of the 36th Annual ACM Symposium on User Interface Software and Technology, 1--11. DOI: 10.1145/3586183.3606821. Online publication date: 29-Oct-2023.
  • (2023) Variational Pose Prediction with Dynamic Sample Selection from Sparse Tracking Signals. Computer Graphics Forum, 42(2):359--369. DOI: 10.1111/cgf.14767. Online publication date: 23-May-2023.
  • (2023) Neural3Points: Learning to Generate Physically Realistic Full-body Motion for Virtual Reality Users. Computer Graphics Forum, 41(8):183--194. DOI: 10.1111/cgf.14634. Online publication date: 20-Mar-2023.
  • (2023) EMDB: The Electromagnetic Database of Global 3D Human Pose and Shape in the Wild. 2023 IEEE/CVF International Conference on Computer Vision (ICCV), 14586--14597. DOI: 10.1109/ICCV51070.2023.01345. Online publication date: 1-Oct-2023.
  • (2023) Geometry-Incorporated Posing of a Full-Body Avatar From Sparse Trackers. IEEE Access, 11:78858--78866. DOI: 10.1109/ACCESS.2023.3299323. Online publication date: 2023.
  • (2022) Fusion Poser: 3D Human Pose Estimation Using Sparse IMUs and Head Trackers in Real Time. Sensors, 22(13):4846. DOI: 10.3390/s22134846. Online publication date: 27-Jun-2022.
