The project was organized as a preliminary study for Use Case #1 of the Horizon 2020 Research Project “Dance in the Dark” (H2020 ICT Project n. 645553 - http://dance.dibris.unige.it). The main objective of the DANCE project is to study and develop novel techniques and algorithms for the automated measurement of non-verbal bodily expression and of the emotional qualities conveyed by human movement, in order to make the perception of non-verbal artistic whole-body experiences accessible to visually impaired people. In the framework of the eNTERFACE ’15 Workshop we investigated methods for analyzing human movements in terms of expressive qualities. When analyzing an individual action, we concentrated mainly on the quality of motion and on elements suggesting different emotions. We developed a system to automatically extract several movement features and transfer them to the auditory domain through interactive sonification. We performed an experiment with 26 participants and collected...
Synchronization is a fundamental component of computational models of human behavior, at both the intra-personal and the inter-personal level. Event synchronization analysis was originally conceived to provide a simple and robust method for measuring synchronization between two time series. In this paper we propose a novel method that extends the state of the art of event synchronization techniques: the Multi-Event-Class Synchronization (MECS) algorithm. MECS measures the synchronization between relevant events belonging to different event classes that are detected in multiple time series. Its motivation emerged from the need to model non-verbal multimodal signals in Human-Computer Interaction. Using MECS, synchronization can be computed between events belonging to the same class (intra-class synchronization) or between events belonging to different classes (inter-class synchronization). In the paper we also show how our technique can deal with macro-events (i.e., sets of event...
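The MECS algorithm itself is not reproduced in this abstract; as a point of reference, the classic event synchronization idea it extends can be sketched as counting, within a tolerance window, how many events in one series are matched by events in the other. The function name and the symmetrized normalization below are illustrative assumptions, not the paper's formulation.

```python
# Minimal illustrative sketch of pairwise event synchronization between two
# event time series (the measure MECS generalizes to multiple event classes).
# `tau` is the tolerance window within which two events count as synchronous.

def event_sync(events_a, events_b, tau):
    """Fraction of events in either series matched by an event in the
    other series within `tau` (symmetrized over both directions)."""
    def count_matches(src, dst):
        return sum(1 for t in src if any(abs(t - u) <= tau for u in dst))
    n = len(events_a) + len(events_b)
    if n == 0:
        return 0.0
    return (count_matches(events_a, events_b)
            + count_matches(events_b, events_a)) / n

# Two series (event times in seconds): the first two events coincide
# within tolerance, the third event of each series is unmatched.
print(event_sync([0.0, 1.0, 2.0], [0.05, 1.02, 3.0], tau=0.1))  # → 4/6
```

A value of 1.0 would mean every event in each series has a counterpart in the other within the tolerance window; 0.0 means no events coincide.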
This Deliverable is based on the outcomes of task "T3.4. Development platform and software libraries" and presents the first collection of software libraries and applications developed by the partners within the WholoDance project. Since dance movement and behaviour analysis is preparatory for other tools, the Deliverable focuses mainly on the description of libraries devoted to the analysis of information related to dance performances. Dance practice is multimodal by nature. For this reason, the development of software and tools for multimodal data analysis is the groundwork for other movement and behaviour analysis tools. The Deliverable includes a SW library for emotional and expressive analysis from musical signals (T3.4.1), a SW library for emotion analysis from full-body movement and multimodal data (T3.4.2), and multimodal analysis of qualities in individual dance (T3.4.4). Dancing can be an individual or social experience. This is the case when more...
EyesWeb patches for the extraction of motion information from video, used in the preparation of the IEMP Collection.
2018 IEEE Games, Entertainment, Media Conference (GEM), 2018
Psychophysical and developmental psychology evidence shows that children have a preferential sensory channel for learning specific concepts, highlighting the need for a multisensory approach to education. Despite this potential for positive impact, multisensory learning has often been penalised and limited by design choices in education technology research. In this work, we present the requirements and the design of an implementable prototype of a serious game aimed at teaching children a precise mathematical concept: angles. The game is based on the interaction between different sensory modalities, in particular movement, vision, and sound.
Proceedings of the 2018 International Conference on Advanced Visual Interfaces, 2018
Analysis of human movement data is a core topic of many research studies in human-human and human-computer interaction. While automated movement analysis is often based on sophisticated computer science techniques (e.g., motion tracking from video recordings), the interdisciplinary nature of research in this area requires tools that can be used by researchers who may not have advanced computer science expertise. This paper presents a system enabling users who are not necessarily computer scientists to perform motion tracking on a dataset of video recordings. The system - consisting of a set of freely downloadable tools accessible through user-friendly graphical interfaces - was designed, developed, and tested in the context of a project for the automated analysis of entrainment in ensemble music performance, following the needs and requirements of musicologists and psychologists.
Sensory cues enable navigation through space, as they inform us about movement properties, such as the amount of travelled distance and the heading direction. In this study, we focused on the ability to spatially update one's position when only proprioceptive and vestibular information is available. We aimed to investigate the effect of yaw rotation on path integration across development in the absence of visual feedback. To this end, we used the triangle completion task: participants were guided through two legs of a triangle and asked to close the shape by walking along its third, imagined leg. To test the influence of yaw rotation across development, we tested children between 6 and 11 years old and adults on angles of different degrees. Our results demonstrated that the amount of turn while executing the angle influences performance at all ages and, in some aspects, also interacts with age. Indeed, whilst adults seemed to adjust their heading towards the end of their walked path, younger children took less advantage of this strategy. The amount of disorientation induced by the path also affected the maturation of the ability to spatially navigate without visual feedback: the greater the induced disorientation, the older children had to be to reach adult-level performance. Overall, these results provide novel insights into the maturation of spatial navigation-related processes.
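The ideal response in the triangle completion task described above follows from simple vector geometry: given the two guided legs and the turn angle between them, the correct homing leg is the vector that closes the triangle. The function below is an illustrative sketch with invented example values, not the study's analysis code.

```python
import math

def third_leg(leg1, leg2, turn_deg):
    """Return (distance, heading_deg) of the ideal homing leg.

    The walker goes `leg1` metres straight, turns `turn_deg` degrees,
    goes `leg2` metres, then must return to the starting point.
    Heading is measured relative to the direction of the first leg.
    """
    heading = math.radians(turn_deg)
    # End position after the two guided legs (first leg along the x axis)
    x = leg1 + leg2 * math.cos(heading)
    y = leg2 * math.sin(heading)
    distance = math.hypot(x, y)          # length of the third leg
    heading_back = math.degrees(math.atan2(-y, -x))  # direction to origin
    return distance, heading_back

# Right-angle example: two 3 m legs with a 90-degree turn between them
d, h = third_leg(3.0, 3.0, 90.0)
print(round(d, 3), round(h, 1))  # → 4.243 -135.0
```

Comparing a participant's actual endpoint against this ideal vector is one common way to quantify path-integration error in such tasks.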
The Interpersonal Entrainment in Music Performance Data Collection (IEMPDC) comprises six related corpora of music research materials: Cuban Son & Salsa (CSS), European String Quartet (ESQ), Malian Jembe (MJ), North Indian Raga (NIR), Tunisian Stambeli (TS), and Uruguayan Candombe (UC). The core data for each corpus comprise media files and computationally extracted event onset timing data. Annotations of metrical structure and the code used in the preparation of the collection are also shared. The collection is unprecedented in size and level of detail and represents a significant new resource for empirical and computational research in music. In this article we introduce the main features of the data collection and the methods used in its preparation. Details of technical validation procedures and notes on data visualization are available as Appendices. We also contextualize the collection in relation to developments in Open Science and Open Data, discussing important distinctions betw...
The EU H2020 ICT Project DANCE investigates how affective and social qualities of human full-body movements can be expressed, represented, and analysed through sound and music performance. In this paper we focus on one of the candidate movement qualities: Fluidity. We present an algorithm to detect Fluidity in full-body movement and a model of interactive sonification to convey Fluidity through the auditory channel. We developed a set of different sonifications: some follow the proposed sonification model, while others are based on different, in some cases opposite, rules. Our hypothesis is that our proposed sonification model is the most effective in communicating Fluidity. To test the hypothesis, we developed a serious game and performed an experiment with 22 participants at the MOCO 2016 conference. Results suggest that the sonifications following our proposed model are indeed the most effective in conveying Fluidity.
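The paper's Fluidity detection algorithm is not reproduced in this abstract. As a generic illustration only, one common ingredient of movement-smoothness measures is low jerk (a small third derivative of position); the finite-difference sketch below shows that idea and is an assumption, not the DANCE algorithm.

```python
# Illustrative smoothness proxy: mean absolute jerk of a sampled 1-D
# trajectory, estimated by repeated finite differencing. Lower values
# indicate smoother (more "fluid") movement under this proxy.

def mean_abs_jerk(positions, dt):
    """Mean absolute jerk (third derivative of position) of a trajectory
    sampled at interval `dt`; requires at least four samples."""
    vel = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    jerk = [(b - a) / dt for a, b in zip(acc, acc[1:])]
    return sum(abs(j) for j in jerk) / len(jerk)

smooth = list(range(10))           # constant velocity: zero jerk
jerky = [0, 1, 0, 1, 0, 1, 0, 1]   # oscillating position: high jerk
print(mean_abs_jerk(smooth, 1.0), mean_abs_jerk(jerky, 1.0))  # → 0.0 4.0
```

In an interactive sonification setting, such a scalar could be mapped frame-by-frame onto a sound parameter, though the mapping used in the paper is not specified here.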
Proceedings of the 5th International Conference on Movement and Computing, 2018
This abstract presents a computational model and a software library for the EyesWeb XMI platform to measure a mid-level movement quality of particular importance for conveying expressivity: Postural Tension. A whole-body posture can be described by a vector containing the angles between the adjacent lines identifying the feet (the line connecting the barycentres of the two feet), knees, hips, trunk, shoulders, head, and gaze (eye direction). Postural Tension is the extent to which a movement exhibits rotation of these multiple horizontal planes, including spirals. The abstract presents a definition of this mid-level quality and describes a demonstration: the movement of a user is captured with a low-cost wearable device, and postural tension and the transmission of energy through the body are then extracted, visualized, and sonified.
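The angle-vector description above can be sketched concretely: reduce each horizontal body plane (feet, knees, hips, trunk, shoulders, head, gaze) to a 2D direction vector and collect the signed angles between adjacent planes. The aggregation into a single torsion score below is an illustrative assumption, not the paper's exact model.

```python
import math

def plane_angles(directions):
    """Signed angles (radians) between each pair of adjacent plane
    direction vectors, given as (x, y) tuples from the feet up to gaze."""
    angles = []
    for (x1, y1), (x2, y2) in zip(directions, directions[1:]):
        a1 = math.atan2(y1, x1)
        a2 = math.atan2(y2, x2)
        # Wrap the difference to (-pi, pi] so rotations are signed
        d = (a2 - a1 + math.pi) % (2 * math.pi) - math.pi
        angles.append(d)
    return angles

def spiral_index(directions):
    """Sum of absolute adjacent-plane rotations: larger values indicate
    stronger torsion ('spiralling') through the body."""
    return sum(abs(a) for a in plane_angles(directions))

# A posture twisting about 0.5 rad at each of three successive levels
dirs = [(1, 0), (math.cos(0.5), math.sin(0.5)),
        (math.cos(1.0), math.sin(1.0)), (math.cos(1.5), math.sin(1.5))]
print(round(spiral_index(dirs), 3))  # → 1.5
```

A perfectly aligned posture (all planes facing the same way) gives an index of zero; accumulating rotations between planes drive the index up, which matches the intuition of tension through spirals.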
Recent results from psychophysics and developmental psychology show that children have a preferential sensory channel for learning specific concepts (spatial and/or temporal). At school, the visual channel is most frequently exploited for teaching, and the other channels are left only a marginal role. However, the visual signal is not always the most powerful and effective channel for perception. In this work we explore the possibility of creating and evaluating a new methodology for teaching and a novel technology for deeper learning of arithmetic (time) and geometry (space). The main novelty of this technology comes from a renewed understanding of the role of communication between sensory modalities during development, namely that specific sensory systems have specific roles in learning specific concepts. Results suggest that it is possible to open a new teaching/learning channel, personalized for each student based on the child's sensory skills. Multisensory interactive technology, including a hardware and software platform to support this approach, is presented and discussed.
Papers by Paolo Alborno