DOI: 10.1145/566570.566629
Article

Eyes alive

Published: 01 July 2002

Abstract

For an animated human face model to appear natural, it should produce eye movements consistent with human ocular behavior. During face-to-face conversational interactions, eyes signal conversational turn-taking and reflect agent thought processes through gaze direction, saccades, and scan patterns. We have implemented an eye movement model based on empirical models of saccades and statistical models of eye-tracking data. Face animations using stationary eyes, eyes with random saccades only, and eyes with statistically derived saccades are compared to evaluate whether they appear natural and effective while communicating.
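
The statistically derived approach can be read as a small generative loop: sample a saccade magnitude from a distribution estimated from eye-tracking data, pick a direction, derive the duration from a main-sequence-style relationship, and wait a mode-dependent interval before the next saccade. The sketch below only illustrates that loop and is not the authors' implementation; the magnitude bins, probabilities, duration constants, and talking/listening gap values are hypothetical placeholders.

```python
import random

# Illustrative-only sketch of a statistically driven saccade generator.
# The distribution values and constants below are assumptions, not the paper's data.

# Hypothetical empirical distribution of saccade magnitudes (degrees) and their
# probabilities, standing in for frequencies estimated from eye-tracking data.
MAGNITUDE_BINS = [2.0, 5.0, 10.0, 15.0]      # degrees
MAGNITUDE_PROBS = [0.45, 0.30, 0.15, 0.10]   # must sum to 1.0

def saccade_duration_ms(magnitude_deg: float) -> float:
    """Main-sequence-style linear model: duration grows roughly linearly
    with magnitude (slope and intercept here are illustrative)."""
    return 21.0 + 2.2 * magnitude_deg

def sample_saccade(talking: bool) -> dict:
    """Draw one saccade: magnitude from the empirical distribution, a random
    direction, and an exponentially distributed inter-saccade gap that is
    shorter while the agent is talking (an assumption for illustration)."""
    magnitude = random.choices(MAGNITUDE_BINS, weights=MAGNITUDE_PROBS, k=1)[0]
    direction = random.uniform(0.0, 360.0)     # degrees, in the frontal plane
    mean_gap_ms = 500.0 if talking else 900.0  # hypothetical mode-dependent gap
    return {
        "magnitude_deg": magnitude,
        "direction_deg": direction,
        "duration_ms": saccade_duration_ms(magnitude),
        "gap_before_next_ms": random.expovariate(1.0 / mean_gap_ms),
    }

if __name__ == "__main__":
    random.seed(0)
    for _ in range(3):
        print(sample_saccade(talking=True))
```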




Published In

SIGGRAPH '02: Proceedings of the 29th annual conference on Computer graphics and interactive techniques
July 2002
574 pages
ISBN: 1581135211
DOI: 10.1145/566570
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]


Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 01 July 2002


Author Tags

  1. HCI (human-computer interface)
  2. eye movement synthesis
  3. facial animation
  4. saccades
  5. statistical modeling

Qualifiers

  • Article

Conference

SIGGRAPH02

Acceptance Rates

SIGGRAPH '02 paper acceptance rate: 67 of 358 submissions (19%).
Overall acceptance rate: 1,822 of 8,601 submissions (21%).


Article Metrics

  • Downloads (last 12 months): 33
  • Downloads (last 6 weeks): 2
Reflects downloads up to 02 Sep 2024


Cited By

  • (2024) Eye Movement in a Controlled Dialogue Setting. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications, pp. 1-7. DOI: 10.1145/3649902.3653337. Online publication date: 4-Jun-2024.
  • (2024) Actor Takeover of Animated Characters. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 1134-1135. DOI: 10.1109/VRW62533.2024.00361. Online publication date: 16-Mar-2024.
  • (2024) Context-Aware Head-and-Eye Motion Generation with Diffusion Model. 2024 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), pp. 157-167. DOI: 10.1109/VR58804.2024.00039. Online publication date: 16-Mar-2024.
  • (2024) Real-Time Multi-Map Saliency-Driven Gaze Behavior for Non-Conversational Characters. IEEE Transactions on Visualization and Computer Graphics 30(7), pp. 3871-3883. DOI: 10.1109/TVCG.2023.3244679. Online publication date: Jul-2024.
  • (2024) Surveying the evolution of virtual humans expressiveness toward real humans. Computers & Graphics 123, article 104034. DOI: 10.1016/j.cag.2024.104034. Online publication date: Oct-2024.
  • (2023) SP-EyeGAN: Generating Synthetic Eye Movement Data with Generative Adversarial Networks. Proceedings of the 2023 Symposium on Eye Tracking Research and Applications, pp. 1-9. DOI: 10.1145/3588015.3588410. Online publication date: 30-May-2023.
  • (2023) Who's next? Proceedings of the 23rd ACM International Conference on Intelligent Virtual Agents, pp. 1-8. DOI: 10.1145/3570945.3607312. Online publication date: 19-Sep-2023.
  • (2023) Crafting Realistic Virtual Humans: Unveiling Perspectives on Human Perception, Crowds, and Embodied Conversational Agents. 2023 36th SIBGRAPI Conference on Graphics, Patterns and Images (SIBGRAPI), pp. 252-257. DOI: 10.1109/SIBGRAPI59091.2023.10347175. Online publication date: 6-Nov-2023.
  • (2023) Supporting Co-Presence in Populated Virtual Environments by Actor Takeover of Animated Characters. 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 940-949. DOI: 10.1109/ISMAR59233.2023.00110. Online publication date: 16-Oct-2023.
  • (2023) Exploring Subjective Realism. International Journal of Human-Computer Studies 175(C). DOI: 10.1016/j.ijhcs.2023.103027. Online publication date: 1-Jul-2023.
