
Differences in head orientation behavior for speakers and listeners: An experiment in a virtual environment

Published: 18 January 2010

Abstract

An experiment was conducted to investigate whether human observers use knowledge of the differences in focus of attention in multiparty interaction to identify the speaker among the meeting participants. A virtual environment was used to ensure good stimulus control. Head orientations, derived from a corpus of tracked head movements, were displayed as the only cue for focus of attention. We present some properties of the relation between head orientation and speaker--listener status, as found in the corpus. The experiment indicates that people do use knowledge of the patterns in focus of attention to distinguish the speaker from the listeners. However, human speaker-identification accuracy was rather low: head orientation (or focus of attention) alone does not provide a sufficient cue for reliable identification of the speaker in a multiparty setting.



Published In

ACM Transactions on Applied Perception, Volume 7, Issue 1 (January 2010), 154 pages
ISSN: 1544-3558; EISSN: 1544-3965
DOI: 10.1145/1658349

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Received: 1 November 2006
Revised: 1 November 2008
Accepted: 1 December 2008
Published: 18 January 2010, in TAP Volume 7, Issue 1


      Author Tags

      1. Head orientation
      2. focus of attention
      3. gaze behavior
      4. multiparty conversation
      5. perception of gaze
      6. virtual environments


