
Audiovisual Speech Perception in Children with Autism Spectrum Disorders

Julia R. Irwin, Haskins Laboratories, New Haven, CT

BACKGROUND

For typically developing perceivers, visual information assists in the recognition of speech in noise (Sumby & Pollack, 1954). Visual information is also used in the perception of unambiguous auditory speech (Irwin et al., 2006). One powerful demonstration of the influence of visual information on what is heard is perceptual integration of mismatched audiovisual (AV) speech. When visual and auditory stimuli have conflicting places of articulation, perceivers may report hearing visually influenced percepts (e.g., auditory /ma/ + visual /ga/ is perceived as /na/), known as the “McGurk Effect” (McGurk & MacDonald, 1976).

Autism spectrum disorders (ASD) refer to a continuum of severe neurodevelopmental disorders characterized by marked deficits in social reciprocity and communication and the presence of restricted or repetitive behaviors. Children with ASD appear to be less influenced by visual speech information (e.g., De Gelder et al., 1991; Massaro & Bosseler, 2003; Williams et al., 2004). Given the tendency of individuals with ASD to avert gaze from the faces of others, attenuated visual influence on heard speech in ASD may reflect less visual fixation on the face of a speaker. Alternatively, children with ASD may have an underlying weakness in the perception of AV speech. As a first step toward investigating these two possible causes of atypical sensitivity to visual speech in ASD, we compared visual influence on heard speech in children with ASD and typically developing (TD) controls on trials in which the children were fixated on the face of a speaker.

Method

Participants
• 2 children with ASD (boys; mean age 9.25 years, range 9-9.5 years)
• 3 TD children (2 girls, 1 boy; mean age 9.3 years, range 7.5-10.5 years)
• Normal hearing and normal or corrected vision
• Native speakers of American English
• ASD participants met criteria for autism on the ADOS, the ADI-R, and by clinical diagnosis.

Stimuli
A male native English speaker was videotaped producing the consonant-vowel syllables /ma/, /na/, and /ga/. These syllables were digitally edited to create matched and mismatched audiovisual (AV) stimuli:
• Matched, cross-spliced syllables /ma/, /na/
• Mismatched syllables: auditory /ma/ + visual /ga/

A signal   +   V signal   =   AV percept
/ma/1      +   /ma/2      =   /ma/
/na/1      +   /na/2      =   /na/
/ma/       +   /ga/       =   /na/

Procedure
• Participants were asked to report what they heard.
• Eye gaze was tracked during presentation of the matched and mismatched (McGurk) speech stimuli.
• A priori “look zones” were created to assess the location of gaze on the face of the speaker.
• Only those trials on which the participant was fixated on the face of the speaker were included in the analyses (an illustrative sketch of this screening step follows).
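The screening step above can be illustrated with a minimal sketch, assuming gaze samples are (x, y) screen coordinates and the a priori look zones are rectangles; the zone coordinates, sample format, and 50% inclusion threshold below are illustrative assumptions, not the parameters used in the study.

# Illustrative sketch only (assumed format, not the study's analysis code):
# classify gaze samples into rectangular look zones and keep a trial when
# gaze was on the speaker's face. Zone coordinates and the inclusion
# threshold are hypothetical.

from dataclasses import dataclass

@dataclass
class Zone:
    left: float
    top: float
    right: float
    bottom: float

    def contains(self, x: float, y: float) -> bool:
        # True when a gaze sample falls inside this rectangle
        return self.left <= x <= self.right and self.top <= y <= self.bottom

# Hypothetical a priori look zones, in pixels on a 1024 x 768 display
LOOK_ZONES = {
    "eyes": Zone(380, 180, 640, 260),
    "mouth": Zone(420, 340, 600, 420),
    "entire_face": Zone(350, 140, 670, 460),
}

def percent_time_in_zone(gaze_samples, zone):
    # Percent of (x, y) gaze samples falling inside the zone
    if not gaze_samples:
        return 0.0
    hits = sum(zone.contains(x, y) for x, y in gaze_samples)
    return 100.0 * hits / len(gaze_samples)

def trial_included(gaze_samples, min_percent_on_face=50.0):
    # Keep the trial only if gaze was on the speaker's face (assumed criterion)
    return percent_time_in_zone(gaze_samples, LOOK_ZONES["entire_face"]) >= min_percent_on_face

Per-trial zone percentages computed this way can also be averaged across trials to produce group summaries such as the look-zone table under Question 2.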
QUESTION 1

Does visual influence on heard speech differ for children with ASD and TD controls?

The mean number of visually influenced responses for ASD and TD participants was compared for trials on which gaze was fixated on the face of the speaker during consonantal closure.

[Figure: Percent of visually influenced responses on trials fixating on the speaker's face, plotted by diagnostic status (ASD vs. TD).]

• Matched AV stimuli: both groups were at ceiling.
• Mismatched (McGurk) stimuli: children with ASD were significantly less visually influenced than TD controls [F(1, 3) = 17.8, p < .02].
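The group comparison behind the F(1, 3) statistic can be sketched as a one-way ANOVA on percent visually influenced responses, with diagnostic group as the between-subjects factor; the scores below are placeholders rather than the study's data, and scipy is assumed to be available.

# Illustrative sketch of the analysis structure, not the study's data:
# one-way ANOVA comparing percent visually influenced responses (McGurk
# trials with gaze on the face) between the ASD and TD groups.

from scipy.stats import f_oneway

asd_scores = [20.0, 30.0]        # placeholder percentages for the 2 ASD children
td_scores = [75.0, 85.0, 90.0]   # placeholder percentages for the 3 TD children

f_stat, p_value = f_oneway(asd_scores, td_scores)

# With 2 groups and 5 children, the test has 1 and 3 degrees of freedom,
# matching the F(1, 3) reported above.
print(f"F(1, 3) = {f_stat:.1f}, p = {p_value:.3f}")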
QUESTION 2

Do gaze patterns to a speaking face differ for children with ASD and TD controls?

• The amount of time spent gazing at the face of the speaker did not differ by group.
• Fixation time in the look zones (eyes, mouth, entire face) did not differ by group.

Mean percent of time in look zone across trials:
        Eyes    Mouth    Entire face
ASD     11.3    29.6     76.4
TD       6.7    48.6     65.3
CONCLUSIONS

Data from this pilot project suggest:
• Children with ASD are significantly less visually influenced than TD controls, even when fixated on the face of the speaker.
• Children with ASD may have an underlying weakness in the perception of AV speech.
• Patterns of gaze to a speaking face do not differ between children with ASD and TD controls.

ONGOING RESEARCH

We are currently comparing children with ASD to both chronological-age-matched and verbal-mental-age-matched TD controls on:
• Visual influence on heard speech with matched and mismatched (McGurk) AV speech
• Visual gain from auditory to AV speech in noise
• Sensitivity to asynchrony in AV speech
• Identification of visual-only (lipread) speech stimuli

ACKNOWLEDGMENTS

This work was supported by NIH grants DC-007339 (Julia R. Irwin, PI) and DC-00403 (Catherine T. Best, PI) to Haskins Laboratories. Thanks to Cathi Best for her continued support, to Larry Brancazio for technical assistance and for graciously serving as the speaker for the stimuli and demonstration, and to Jessica Grittner and Tiffany Gooding for assistance with data collection.

REFERENCES

De Gelder, B., Vroomen, J., & Van der Heide, L. (1991). Face recognition and lip-reading in autism. European Journal of Cognitive Psychology, 3, 69-86.
Irwin, J. R., Whalen, D. H., & Fowler, C. A. (2006). A sex difference in visual influence on heard speech. Perception & Psychophysics, 68, 582-592.
Massaro, D. W., & Bosseler, A. (2003). Perceiving speech by ear and eye: Multimodal integration by children with autism. The Journal of Developmental and Learning Disorders, 7, 111-146.
McGurk, H., & MacDonald, J. (1976). Hearing lips and seeing voices. Nature, 264, 746-748.
Sumby, W. H., & Pollack, I. (1954). Visual contribution to speech intelligibility in noise. Journal of the Acoustical Society of America, 26, 212-215.
Williams, J. H. G., Massaro, D. W., Peel, N. J., Bosseler, A., & Suddendorf, T. (2004). Visual-auditory integration during speech imitation in autism. Research in Developmental Disabilities, 25, 559-575.