DOI: 10.1145/1836248.1836268

Perception of linear and nonlinear motion properties using a FACS validated 3D facial model

Published: 23 July 2010

Abstract

In this paper we present the first Facial Action Coding System (FACS) validated model based on dynamic 3D scans of human faces for use in graphics and psychological research. The model is parameterized by FACS Action Units (AUs) and has been independently validated by FACS experts. Using this model, we explore the perceptual differences between linear facial motions, represented by a linear blend shape approach, and real facial motions synthesized through the 3D facial model. Through numerical measures and visualizations, we show that the latter type of motion is geometrically nonlinear in terms of its vertices. In experiments, we explore the perceptual benefits of nonlinear motion for different AUs. Our results are insightful for designers of animation systems in both the entertainment industry and scientific research: they reveal a significant overall benefit of captured nonlinear geometric vertex motion over linear blend shape motion. However, our findings suggest that not all motions need to be animated nonlinearly; the advantage may depend on the type of facial action being produced and the phase of the movement.
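
To make the abstract's central contrast concrete, the sketch below compares the two kinds of motion it discusses: linear blend shape interpolation, in which every vertex travels along a straight line at a constant rate between key shapes, and captured motion, in which each vertex follows its own recorded, generally nonlinear trajectory. This is a minimal illustration only, not the authors' pipeline; the tiny mesh, the ease-in/ease-out time course standing in for "captured" data, and all function names are assumptions made for this example.

```python
import numpy as np

# Two key shapes for one FACS Action Unit (AU): neutral face and peak
# expression. Shapes are (V, 3) vertex arrays; the values are stand-ins.
V = 4  # tiny mesh, for illustration only
neutral = np.zeros((V, 3))
peak = np.ones((V, 3))

def linear_blendshape(t):
    """Linear blend shape motion: every vertex moves in a straight line
    from neutral to peak at a constant rate as t goes from 0 to 1."""
    return (1.0 - t) * neutral + t * peak

def captured_nonlinear(t, trajectory):
    """Captured motion: each vertex follows its own recorded trajectory.
    `trajectory` is an (F, V, 3) array of per-frame scanned positions,
    so neither the path nor the timing need be linear in t."""
    frames = trajectory.shape[0]
    # Sample the recorded sequence at parameter t (nearest frame here;
    # a real system would interpolate between frames).
    f = min(int(round(t * (frames - 1))), frames - 1)
    return trajectory[f]

# Fake "captured" data with an ease-in/ease-out time course, one simple
# kind of nonlinearity that differs from constant-rate linear blending.
ts = np.linspace(0.0, 1.0, 30)
eased = 0.5 - 0.5 * np.cos(np.pi * ts)  # nonlinear timing, 0 -> 1
trajectory = eased[:, None, None] * (peak - neutral) + neutral

# Midway through the action the two approaches already disagree:
print(linear_blendshape(0.25)[0])               # vertex 0, linear blend
print(captured_nonlinear(0.25, trajectory)[0])  # vertex 0, eased motion
```

Even in this toy setting the two motions agree at the endpoints but diverge mid-action, which mirrors the abstract's point that the advantage of nonlinear motion can depend on the phase of the movement.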



Published In

APGV '10: Proceedings of the 7th Symposium on Applied Perception in Graphics and Visualization
July 2010
171 pages
ISBN: 9781450302487
DOI: 10.1145/1836248
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 23 July 2010


Author Tags

  1. 3D dynamic facial capture and animation
  2. FACS

Qualifiers

  • Research-article

Conference

APGV '10

Acceptance Rates

Overall acceptance rate: 19 of 33 submissions (58%)


Article Metrics

  • Downloads (last 12 months): 10
  • Downloads (last 6 weeks): 0
Reflects downloads up to 16 Oct 2024


Cited By

  • (2024) Can deepfakes be used to study emotion perception? A comparison of dynamic face stimuli. Behavior Research Methods 56(7), 7674-7690. DOI: 10.3758/s13428-024-02443-y. Online publication date: 4-Jun-2024
  • (2024) Recognizing facial expressions of emotion amid noise: A dynamic advantage. Journal of Vision 24(1), 7. DOI: 10.1167/jov.24.1.7. Online publication date: 10-Jan-2024
  • (2024) A preliminary characterization of the psychometric properties and generalizability of a novel social approach-avoidance paradigm. Motivation and Emotion 48(3), 278-294. DOI: 10.1007/s11031-024-10076-z. Online publication date: 6-Jul-2024
  • (2024) A Dynamic Disadvantage? Social Perceptions of Dynamic Morphed Emotions Differ from Videos and Photos. Journal of Nonverbal Behavior 48(2), 303-322. DOI: 10.1007/s10919-023-00448-3. Online publication date: 13-Jan-2024
  • (2023) Nonverbal Markers of Empathy in Virtual Healthcare Professionals. Proceedings of the 23rd ACM International Conference on Intelligent Virtual Agents, 1-4. DOI: 10.1145/3570945.3607291. Online publication date: 19-Sep-2023
  • (2023) Testing, explaining, and exploring models of facial expressions of emotions. Science Advances 9(6). DOI: 10.1126/sciadv.abq8421. Online publication date: 10-Feb-2023
  • (2023) The role of facial movements in emotion recognition. Nature Reviews Psychology 2(5), 283-296. DOI: 10.1038/s44159-023-00172-1. Online publication date: 27-Mar-2023
  • (2021) FACSHuman, a software program for creating experimental material by modeling 3D facial expressions. Behavior Research Methods 53(5), 2252-2272. DOI: 10.3758/s13428-021-01559-9. Online publication date: 6-Apr-2021
  • (2021) Learning 3DMM Deformation Coefficients for Action Unit Detection. Machine Learning and Metaheuristics Algorithms, and Applications, 1-14. DOI: 10.1007/978-981-16-0419-5_1. Online publication date: 6-Feb-2021
  • (2020) Perception of “Live” Facial Expressions. Experimental Psychology (Russia) 13(3), 55-73. DOI: 10.17759/exppsy.2020130305. Online publication date: 2020
