
      "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism.


          Abstract

          Conversation requires integration of information from faces and voices to fully understand the speaker's message. To detect auditory-visual asynchrony of speech, listeners must integrate visual movements of the face, particularly the mouth, with auditory speech information. Individuals with autism spectrum disorder may be less successful at such multisensory integration, despite their demonstrated preference for looking at the mouth region of a speaker. We showed participants aged 8-19, with and without high-functioning autism (HFA), a split-screen video of two identical individuals speaking side by side. Only one of the speakers was in synchrony with the corresponding audio track, and synchrony switched between the two speakers every few seconds. Participants were asked to watch the video without further instructions (implicit condition) or to specifically watch the in-sync speaker (explicit condition). We recorded which part of the screen and face their eyes targeted. Both groups looked at the in-sync video significantly more with explicit instructions. However, participants with HFA looked at the in-sync video less than typically developing (TD) peers and did not increase their gaze time as much as TD participants in the explicit task. Importantly, the HFA group looked significantly less at the mouth than their TD peers, and significantly more at non-face regions of the image. There were no between-group differences for eye-directed gaze. Overall, individuals with HFA spend less time looking at the crucially important mouth region of the face during auditory-visual speech integration, which is maladaptive gaze behavior for this type of task.

          Author and article information

          Journal
          Autism Res (Autism Research: Official Journal of the International Society for Autism Research)
          Wiley-Blackwell
          ISSN: 1939-3806
          Jun 2015
          Volume: 8
          Issue: 3
          Affiliations
          [1 ] Emerson College, Department of Communication Sciences and Disorders, 120 Boylston Street, Boston, Massachusetts.
          [2 ] University of Massachusetts Medical School Shriver Center, 200 Trapelo Rd, Waltham, Massachusetts.
          Article
          NIHMS: NIHMS697331
          DOI: 10.1002/aur.1447
          PMCID: PMC4474762
          PMID: 25620208

          Keywords: audio-visual integration, eye tracking, face perception, high-functioning autism, mouth-directed gaze
