research-article
DOI: 10.1145/3212721.3212816

Towards a model of nonverbal leadership in unstructured joint physical activity

Published: 28 June 2018

Abstract

In this paper, we propose a set of algorithms to compute cues of nonverbal leadership in unstructured joint full-body physical activity, i.e., a joint activity in which two or more interacting persons perform movements without a predefined sequence and without a predefined leader. An example of such an activity is contact dance improvisation.
The paper is composed of three parts: the cue set, the dataset, and the algorithms. First, we propose a cue set of nonverbal leadership grounded in the existing literature and studies. It comprises eight cues that characterize the nonverbal behaviors of the leader in a joint full-body physical activity.
We also introduce a new dataset consisting of multimodal data (video, MoCap) of contact dance improvisations. Additionally, sensory-deprivation conditions (vision and/or touch restraint) were introduced to collect evidence of the various strategies used by leaders and followers during improvisation. The dataset was annotated by twenty-seven persons, who carried out continuous annotation of leadership in the recorded material.
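The paper's annotation protocol and aggregation procedure are not reproduced here. As a rough illustration only, continuous leadership annotations from many raters might be combined as in the sketch below; the function name, the -1/+1 leadership encoding, and the agreement measure (mean pairwise Pearson correlation) are all assumptions, not the authors' method.

```python
import numpy as np

def aggregate_annotations(annotations: np.ndarray):
    """Combine continuous leadership annotations from several raters.

    annotations: array of shape (n_raters, n_frames); each row is one
    rater's continuous leadership signal over time (assumed here to be
    encoded from -1 = second dancer leads to +1 = first dancer leads).
    Returns the frame-wise mean curve (a consensus annotation) and the
    mean pairwise Pearson correlation between raters as a crude
    inter-rater agreement score.
    """
    consensus = annotations.mean(axis=0)
    r = np.corrcoef(annotations)              # (n_raters, n_raters)
    upper = r[np.triu_indices_from(r, k=1)]   # each rater pair once
    return consensus, float(upper.mean())
```

With perfectly agreeing raters the agreement score is 1.0; disagreeing raters pull it toward 0 or below.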
In the last part of the paper, we propose a set of algorithms that work on positional 3D data (i.e., joint positions obtained from the motion capture data of the dancers). Each algorithm models one of the discussed cues of nonverbal leadership.
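The paper's actual algorithms are not given in this abstract. As one hedged sketch of how a single cue could be modelled from positional 3D data, the code below (function names, the chosen cue, and all parameters are assumptions) estimates temporal precedence: the lag at which one dancer's overall movement speed best predicts the other's, found by cross-correlating the two speed profiles.

```python
import numpy as np

def speed_profile(joints: np.ndarray, fps: float) -> np.ndarray:
    """Frame-wise overall movement speed of one dancer.

    joints: array of shape (n_frames, n_joints, 3) holding 3D joint
    positions from motion capture. Returns shape (n_frames - 1,):
    the mean speed of all joints at each frame transition.
    """
    vel = np.diff(joints, axis=0) * fps              # per-joint velocity
    return np.linalg.norm(vel, axis=2).mean(axis=1)  # mean speed magnitude

def leadership_lag(joints_a, joints_b, fps=100.0, max_lag_s=0.5):
    """Lag (in seconds) at which dancer A's speed best predicts B's.

    A positive value means A's movements tend to precede B's, i.e. A
    acts as the leader under this (temporal-precedence) cue; a negative
    value means the opposite. This models only one candidate cue.
    """
    sa, sb = speed_profile(joints_a, fps), speed_profile(joints_b, fps)
    sa = (sa - sa.mean()) / (sa.std() + 1e-12)       # z-score both signals
    sb = (sb - sb.mean()) / (sb.std() + 1e-12)

    def xcorr(lag):  # mean product over the overlapping, lag-shifted parts
        if lag >= 0:
            a, b = sa[:len(sa) - lag], sb[lag:]
        else:
            a, b = sa[-lag:], sb[:lag]
        return float(np.dot(a, b)) / len(a)

    max_lag = int(max_lag_s * fps)
    best = max(range(-max_lag, max_lag + 1), key=xcorr)
    return best / fps
```

The bounded lag window (`max_lag_s`) keeps the search local, since in improvisation leader-follower roles can switch within seconds.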


Cited By

  • (2024) Sounding bodies: Exploring sonification to promote physical contact. Proceedings of the 2024 International Conference on Advanced Visual Interfaces, 1-3. DOI: 10.1145/3656650.3656749
  • (2023) Gaze Target Detection Based on Predictive Consistency Embedding. Journal of Image and Signal Processing 12, 2, 144-157. DOI: 10.12677/JISP.2023.122015
  • (2023) Co-Located Human–Human Interaction Analysis Using Nonverbal Cues: A Survey. ACM Computing Surveys 56, 5, 1-41. DOI: 10.1145/3626516
  • (2022) Multimodal Across Domains Gaze Target Detection. Proceedings of the 2022 International Conference on Multimodal Interaction, 420-431. DOI: 10.1145/3536221.3556624
  • (2021) Predicting Gaze from Egocentric Social Interaction Videos and IMU Data. Proceedings of the 2021 International Conference on Multimodal Interaction, 717-722. DOI: 10.1145/3462244.3479954
  • (2019) The role of respiration audio in multimodal analysis of movement qualities. Journal on Multimodal User Interfaces 14, 1, 1-15. DOI: 10.1007/s12193-019-00302-1


Published In

MOCO '18: Proceedings of the 5th International Conference on Movement and Computing
June 2018, 329 pages
ISBN: 9781450365048
DOI: 10.1145/3212721

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. Leadership
  2. dance
  3. nonverbal behavior
  4. social signal processing

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Acceptance Rates

Overall Acceptance Rate: 85 of 185 submissions (46%)


