University of Bath
Psychology
When we observe someone perform a familiar action, we can usually predict what kind of sound that action will produce. Musical actions are over-experienced by musicians and not by non-musicians, and thus offer a unique way to examine how action expertise affects brain processes when the predictability of the produced sound is manipulated. We used functional magnetic resonance imaging to scan 11 drummers and 11 age- and gender-matched novices who made judgments on point-light drumming movements presented with sound. In Experiment 1, sound was synchronized or desynchronized with drumming strikes, while in Experiment 2 sound was always synchronized, but the natural covariation between sound intensity and velocity of the drumming strike was maintained or eliminated. Prior to MRI scanning, each participant completed psychophysical testing to identify personal levels of synchronous and asynchronous timing to be used in the two fMRI activation tasks. In both experiments, the drummers' brain activation was reduced in motor and action representation brain regions when sound matched the observed movements, and was similar to that of novices when sound was mismatched. This reduction in neural activity occurred bilaterally in the cerebellum and left parahippocampal gyrus in Experiment 1, and in the right inferior parietal lobule, inferior temporal gyrus, middle frontal gyrus and precentral gyrus in Experiment 2. Our results indicate that brain functions in action-sound representation areas are modulated by multimodal action expertise.
► Action expertise alters audiovisual brain mechanisms of biological motion.
► Cerebellum activity is reduced for over-learned audiovisual synchrony actions.
► Natural audiovisual covariation reduces fronto-temporo-parietal activity for experts.
- by Frank Pollick and +4
- Psychophysics, Music, Magnetic Resonance Imaging, Adolescent
We investigated the effect of musical expertise on sensitivity to asynchrony for drumming point-light displays, which varied in their physical characteristics (Experiment 1) or in their degree of audiovisual congruency (Experiment 2). In Experiment 1, 21 repetitions of three tempos × three accents × nine audiovisual delays were presented to four jazz drummers and four novices. In Experiment 2, ten repetitions of two audiovisual incongruency conditions × nine audiovisual delays were presented to 13 drummers and 13 novices. Participants gave forced-choice judgments of audiovisual synchrony. The results of Experiment 1 show an enhancement in experts’ ability to detect asynchrony, especially for slower drumming tempos. In Experiment 2 an increase in sensitivity to asynchrony was found for incongruent stimuli; this increase, however, is attributable only to the novice group. Altogether the results indicated that through musical practice we learn to ignore variations in stimulus characteristics that otherwise would affect our multisensory integration processes.
- by Frank Pollick and +2
- Psychophysics, Music, Perception, Auditory Perception
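A common way to summarize forced-choice synchrony judgments like those above is to take, for each audiovisual delay (SOA), the proportion of "synchronous" responses, and read off the centre and spread of the resulting distribution. The sketch below uses purely hypothetical numbers and a simple centre-of-mass summary; it is not the paper's actual fitting procedure, just an illustration of the kind of measure involved.

```python
# Hypothetical proportions of "synchronous" responses at each audiovisual
# delay (SOA, in ms; negative = audio leads). Illustrative values only.
soas = [-240, -180, -120, -60, 0, 60, 120, 180, 240]
p_sync = [0.05, 0.15, 0.45, 0.80, 0.95, 0.85, 0.55, 0.20, 0.05]

total = sum(p_sync)
# Centre of mass approximates the point of subjective simultaneity (PSS).
pss = sum(s * p for s, p in zip(soas, p_sync)) / total
# The spread of the distribution indexes sensitivity to asynchrony:
# a narrower spread means asynchrony is detected at smaller delays.
spread = (sum(p * (s - pss) ** 2 for s, p in zip(soas, p_sync)) / total) ** 0.5

print(f"PSS = {pss:.1f} ms, spread = {spread:.1f} ms")
```

Under this summary, the experts' enhanced asynchrony detection reported above would show up as a narrower spread than that of novices.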
In the present study we applied a paradigm often used in face–voice affect perception to solo music improvisation to examine how the emotional valence of sound and gesture are integrated when perceiving an emotion. Three brief excerpts expressing emotion produced by a drummer and three by a saxophonist were selected. From these bimodal congruent displays the audio-only, visual-only, and audiovisually incongruent conditions (obtained by combining the two signals both within and between instruments) were derived. In Experiment 1 twenty musical novices judged the perceived emotion and rated the strength of each emotion. The results indicate that sound dominated the visual signal in the perception of affective expression, though this was more evident for the saxophone. In Experiment 2 a further sixteen musical novices were asked to pay attention either to the musicians' movements or to the sound when judging the perceived emotions. The results showed no effect of visual information when judging the sound. By contrast, when judging the emotional content of the visual information, performance worsened in the incongruent condition that combined different emotional auditory and visual information for the same instrument. The effect of emotionally discordant information thus became evident only when the auditory and visual signals belonged to the same categorical event, despite their temporal mismatch. This suggests that the integration of emotional information may be reinforced by its semantic attributes but might be independent of temporal features.
- by Frank Pollick and +1
- Cognitive Science, Psychophysics, Emotion, Brain
Human beings often observe other people's social interactions without being a part of them. Whereas the implications of some brain regions (e.g. amygdala) have been extensively examined, the implication of the precuneus remains yet to be determined. Here we examined the implication of the precuneus in third-person perspective of social interaction using functional magnetic resonance imaging (fMRI). Participants performed a socially irrelevant task while watching the biological motion of two agents acting in either typical (congruent to social conventions) or atypical (incongruent to social conventions) ways. When compared to typical displays, the atypical displays elicited greater activation in the central and posterior bilateral precuneus, and in frontoparietal and occipital regions. Whereas the right precuneus responded with greater activation also to upside down than upright displays, the left precuneus did not. Correlations and effective connectivity analysis added consisten...
Sound source recognition investigates the recovery of different features of the objects whose interaction leads to the generation of the acoustical signal. Among these features, material type has received particular attention, while the recovery of material properties, such as hardness, has scarcely been considered. Hardness also plays a significant role in the musical field, especially for percussion instruments, where resonating objects of variable hardness are struck with mallets of variable hardness. Comparison of previous results on ...
When visual information is available, human adults, but not children, have been shown to reduce sensory uncertainty by taking a weighted average of sensory cues. In the absence of reliable visual information (e.g. an extremely dark environment, visual disorders), the use of other information is vital. Here we ask how humans combine haptic and auditory information from childhood. In the first experiment, adults and children aged 5 to 11 years judged the relative sizes of two objects in auditory, haptic, and non-conflicting bimodal conditions. In the second experiment, different groups of adults and children were tested in non-conflicting and conflicting bimodal conditions. In the first experiment, adults reduced sensory uncertainty by integrating the cues optimally, while children did not. In the second experiment, adults and children used similar weighting strategies to solve audio-haptic conflict. These results suggest that, in the absence of visual information, optimal integration of cues for discrimination of object size develops late in childhood.
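The "optimal" weighted averaging referred to above is usually modelled as maximum-likelihood cue integration, in which each cue is weighted by its inverse variance. A minimal sketch, with purely illustrative numbers (not data from the study):

```python
def integrate(estimate_a, var_a, estimate_h, var_h):
    """Combine auditory and haptic size estimates by inverse-variance
    (maximum-likelihood) weighting; returns the combined estimate and
    its variance."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_h)
    w_h = 1 - w_a
    combined = w_a * estimate_a + w_h * estimate_h
    combined_var = (var_a * var_h) / (var_a + var_h)
    return combined, combined_var

# Illustrative example: the more reliable haptic cue dominates, and the
# combined variance falls below either single-cue variance.
size, var = integrate(10.0, 4.0, 12.0, 1.0)
print(size, var)  # 11.6 0.8
```

A non-integrating observer, by contrast, would show a bimodal variance no better than that of the more reliable single cue.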
Single-cell data from macaque suggest special processing of the sights and sounds of biological actions (Kohler, Keysers, et al., Science, 2002). Recently, Arrighi, Alais & Burr (JOV, 2006) examined this hypothesis using judgments of perceptual synchrony of audio and visual streams of conga drumming, as well as with synthetic audio and visual streams. The perception of audiovisual temporal synchrony provides a window on how these two different sensory modalities are integrated. To further investigate the perception ...
- by Karin Petrini and +2
- Vision
Correctly localising sensory stimuli in space is a formidable challenge for the newborn brain. A new study provides a first glimpse into how human brain mechanisms for sensory remapping develop in the first year of life.
Single-cell data from macaque suggest special processing of the sights and sounds of biological actions (Kohler, Keysers, et al., Science, 2002). Recently, Arrighi, Alais & Burr (JOV, 2006) examined this hypothesis using judgments of perceptual synchrony of audio and visual streams of conga drumming, as well as with synthetic audio and visual streams. The perception of audiovisual temporal synchrony provides a window on how these two different sensory modalities are integrated. To further investigate the perception ...
- by C. Waadeland and +3
- Vision