DOI: 10.1145/566570.566607
Article

Interactive control of avatars animated with human motion data

Published: 01 July 2002

Abstract

Real-time control of three-dimensional avatars is an important problem in the context of computer games and virtual environments. Avatar animation and control is difficult, however, because a large repertoire of avatar behaviors must be made available, and the user must be able to select from this set of behaviors, possibly with a low-dimensional input device. One appealing approach to obtaining a rich set of avatar behaviors is to collect an extended, unlabeled sequence of motion data appropriate to the application. In this paper, we show that such a motion database can be preprocessed for flexibility in behavior and efficient search and exploited for real-time avatar control. Flexibility is created by identifying plausible transitions between motion segments, and efficient search through the resulting graph structure is obtained through clustering. Three interface techniques are demonstrated for controlling avatar motion using this data structure: the user selects from a set of available choices, sketches a path through an environment, or acts out a desired motion in front of a video camera. We demonstrate the flexibility of the approach through four different applications and compare the avatar motion to directly recorded human motion.
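
To make the approach described in the abstract concrete, the following is a minimal, hypothetical Python sketch of the two preprocessing ideas it mentions: linking motion frames through plausible transitions (a graph over the database) and clustering poses so the graph can be searched efficiently at runtime. The pose representation, the distance threshold, the plain k-means step, and the function names (build_motion_graph, cluster_poses) are illustrative assumptions, not the paper's actual algorithm or code.

# Minimal, hypothetical sketch of the preprocessing described in the abstract:
# (1) connect each captured frame to its natural successor, (2) add transition
# edges between frames whose poses are similar, and (3) cluster poses so a
# controller can search over clusters rather than individual frames.
# The pose representation, threshold, and k-means step are assumptions for
# illustration only; they are not the paper's implementation.
import numpy as np

def build_motion_graph(poses, threshold=0.5):
    """poses: (n_frames, n_dofs) array. Returns an adjacency list of frame indices."""
    n = len(poses)
    graph = {i: [] for i in range(n)}
    for i in range(n - 1):
        graph[i].append(i + 1)                 # natural successor in the capture
    for i in range(n):
        dist = np.linalg.norm(poses - poses[i], axis=1)
        for j in np.flatnonzero(dist < threshold):
            if j != i and j != i + 1:
                graph[i].append(int(j))        # plausible transition (cut) to frame j
    return graph

def cluster_poses(poses, k=8, iters=20, seed=0):
    """Plain k-means over poses; the labels give a coarse index for efficient search."""
    rng = np.random.default_rng(seed)
    centers = poses[rng.choice(len(poses), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = np.linalg.norm(poses[:, None, :] - centers[None, :, :], axis=2)
        labels = np.argmin(dists, axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = poses[labels == c].mean(axis=0)
    return labels

if __name__ == "__main__":
    # Synthetic stand-in for an unlabeled motion database: a smooth random walk
    # over 12 degrees of freedom, 200 frames long.
    demo = np.cumsum(np.random.default_rng(1).normal(size=(200, 12)) * 0.1, axis=0)
    graph = build_motion_graph(demo, threshold=1.0)
    labels = cluster_poses(demo, k=4)
    print(sum(len(v) for v in graph.values()), "edges,", len(set(labels.tolist())), "clusters")

At runtime, an interface of the kind the abstract describes (menu choice, sketched path, or acting in front of a camera) would pick among the outgoing edges of the avatar's current frame, with the cluster labels pruning the search.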



Index Terms

  1. Interactive control of avatars animated with human motion data

      Recommendations

      Comments

      Information & Contributors

      Information

      Published In

      SIGGRAPH '02: Proceedings of the 29th annual conference on Computer graphics and interactive techniques
      July 2002
      574 pages
      ISBN:1581135211
      DOI:10.1145/566570
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 01 July 2002


      Author Tags

      1. avatars
      2. human motion
      3. interactive control
      4. motion capture
      5. virtual environments

      Qualifiers

      • Article

      Conference

      SIGGRAPH '02

      Acceptance Rates

      SIGGRAPH '02 paper acceptance rate: 67 of 358 submissions (19%)
      Overall acceptance rate: 1,822 of 8,601 submissions (21%)


      Bibliometrics & Citations

      Bibliometrics

      Article Metrics

      • Downloads (last 12 months): 156
      • Downloads (last 6 weeks): 10
      Reflects downloads up to 02 Sep 2024


      Citations

      Cited By

      • (2024) Flexible Motion In-betweening with Diffusion Models. ACM SIGGRAPH 2024 Conference Papers, pp. 1-9. DOI: 10.1145/3641519.3657414. Online publication date: 13-Jul-2024
      • (2024) Ecological Validity and the Evaluation of Avatar Facial Animation Noise. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 72-79. DOI: 10.1109/VRW62533.2024.00019. Online publication date: 16-Mar-2024
      • (2024) Human Motion Generation: A Survey. IEEE Transactions on Pattern Analysis and Machine Intelligence 46(4), pp. 2430-2449. DOI: 10.1109/TPAMI.2023.3330935. Online publication date: Apr-2024
      • (2024) Motion Generation and Analyzing the User’s Arm Muscles via Leap Motion and Its Data-Driven Representations. IEEE Access 12, pp. 47787-47796. DOI: 10.1109/ACCESS.2024.3383318. Online publication date: 2024
      • (2024) A Scalable Vector Graphics Warping System for Anthropomorphizing Game Characters. IEEE Access 12, pp. 32472-32481. DOI: 10.1109/ACCESS.2024.3369185. Online publication date: 2024
      • (2024) Generating Continual Human Motion in Diverse 3D Scenes. 2024 International Conference on 3D Vision (3DV), pp. 903-913. DOI: 10.1109/3DV62453.2024.00061. Online publication date: 18-Mar-2024
      • (2023) Virtual Reality Dance Tracks from Skeletal Animations. Proceedings of the 25th Symposium on Virtual and Augmented Reality, pp. 248-253. DOI: 10.1145/3625008.3625035. Online publication date: 6-Nov-2023
      • (2023) SAME: Skeleton-Agnostic Motion Embedding for Character Animation. SIGGRAPH Asia 2023 Conference Papers, pp. 1-11. DOI: 10.1145/3610548.3618206. Online publication date: 10-Dec-2023
      • (2023) Neural Motion Graph. SIGGRAPH Asia 2023 Conference Papers, pp. 1-11. DOI: 10.1145/3610548.3618181. Online publication date: 10-Dec-2023
      • (2023) Discovering Fatigued Movements for Virtual Character Animation. SIGGRAPH Asia 2023 Conference Papers, pp. 1-12. DOI: 10.1145/3610548.3618176. Online publication date: 10-Dec-2023
