
      Intermodal attention affects the processing of the temporal alignment of audiovisual stimuli



          The temporal asynchrony between inputs to different sensory modalities has been shown to be a critical factor influencing the interaction between such inputs. We used scalp-recorded event-related potentials (ERPs) to investigate the effects of attention on the processing of audiovisual multisensory stimuli as the temporal asynchrony between the auditory and visual inputs varied across the audiovisual integration window (i.e., up to 125 ms). Randomized streams of unisensory auditory stimuli, unisensory visual stimuli, and audiovisual stimuli (consisting of the temporally proximal presentation of the visual and auditory stimulus components) were presented centrally while participants attended to either the auditory or the visual modality to detect occasional target stimuli in that modality. ERPs elicited by each of the contributing sensory modalities were extracted by signal-processing techniques from the combined ERP waveforms elicited by the multisensory stimuli. This was done for each of five 50-ms subranges of stimulus onset asynchrony (SOA: e.g., V precedes A by 125–75 ms, by 75–25 ms, etc.). The extracted ERPs for the visual inputs of the multisensory stimuli were compared with each other and with the ERPs to the unisensory visual control stimuli, separately for attention directed to the visual and to the auditory modality. The results showed that the attention effect on the right-hemisphere visual P1 was largest when the auditory and visual stimuli were temporally aligned. In contrast, the N1 attention effect was smallest at this latency, suggesting that attention may play a role in the processing of the relative temporal alignment of the constituent parts of multisensory stimuli. At longer latencies an occipital selection negativity for attended versus unattended visual stimuli was also observed, but this effect did not vary as a function of SOA, suggesting that by that latency a stable representation of the auditory and visual stimulus components had been established.
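As a sketch of the trial-averaging step the abstract refers to, the following Python snippet shows how condition-wise ERPs and an attended-minus-unattended difference wave could be computed. It uses synthetic data and illustrative names (sampling rate, window bounds, and the fixed 2-µV offset are all assumptions), not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 500                       # assumed sampling rate in Hz
n_trials, n_samples = 100, 300  # 100 epochs of 600 ms each

# Synthetic single-trial epochs (trials x samples); real data would come
# from segmented, baseline-corrected EEG. The attended condition carries
# an artificial 2-uV offset so the difference wave is nonzero.
attended = rng.normal(0.0, 5.0, (n_trials, n_samples)) + 2.0
unattended = rng.normal(0.0, 5.0, (n_trials, n_samples))

# ERPs are obtained by averaging across trials within each condition.
erp_att = attended.mean(axis=0)
erp_unatt = unattended.mean(axis=0)

# The attention effect is the attended-minus-unattended difference wave.
attention_effect = erp_att - erp_unatt

# Mean amplitude in a P1-like window (here 80-130 ms post-stimulus).
win = slice(int(0.080 * fs), int(0.130 * fs))
p1_effect = attention_effect[win].mean()
```

In a real analysis this averaging would be repeated per SOA subrange and per attended modality before comparing the resulting component amplitudes.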


          Most cited references (50)


          Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study.

           M. Giard, F. Péronnet (1999)
          The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: At each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two objects were defined by auditory features alone, visual features alone, or the combination of auditory and visual features. Subjects were more accurate and faster at identifying multimodal than unimodal objects. Spatiotemporal analysis of ERPs and scalp current densities revealed several temporally, spatially, and functionally distinct auditory-visual interaction components before 200 msec poststimulus. The effects observed were (1) in visual areas, new neural activities (as early as 40 msec poststimulus) and modulation (amplitude decrease) of the N185 wave to the unimodal visual stimulus, (2) in the auditory cortex, modulation (amplitude increase) of subcomponents of the unimodal auditory N1 wave around 90 to 110 msec, and (3) new neural activity over the right fronto-temporal area (140 to 165 msec). Furthermore, when the subjects were separated into two groups according to their dominant modality to perform the task in unimodal conditions (shortest reaction-time criterion), the integration effects were found to be similar for the two groups over the nonspecific fronto-temporal areas, but they clearly differed in the sensory-specific cortices, affecting predominantly the sensory areas of the nondominant modality. Taken together, the results indicate that multisensory integration is mediated by flexible, highly adaptive physiological processes that can take place very early in the sensory processing chain and operate in both sensory-specific and nonspecific cortical structures in different ways.
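The ERP interaction analysis described in this reference rests on the additive model: the multisensory response is compared against the sum of the unisensory responses, and any residual is attributed to cross-modal interaction. A minimal illustration with synthetic trial-averaged ERPs (all values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 250  # assumed number of samples per averaged epoch

# Synthetic trial-averaged ERPs for auditory-only (A), visual-only (V),
# and audiovisual (AV) conditions; illustrative values only.
erp_a = rng.normal(0.0, 1.0, n_samples)
erp_v = rng.normal(0.0, 1.0, n_samples)
erp_av = erp_a + erp_v + 0.5  # built with a superadditive 0.5-uV residual

# Under the additive model, interactions are AV - (A + V): a nonzero
# residual reflects cross-modal interaction rather than summed
# unisensory activity.
interaction = erp_av - (erp_a + erp_v)
```

Here the residual recovers the 0.5-µV interaction term by construction; with real EEG the residual is noisy and is evaluated statistically across subjects and time windows.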

            Determinants of multisensory integration in superior colliculus neurons. I. Temporal factors.

            One of the most impressive features of the central nervous system is its ability to process information from a variety of stimuli to produce an integrated, comprehensive representation of the external world. In the present study, the temporal disparity among combinations of different sensory stimuli was shown to be a critical factor influencing the integration of multisensory stimuli by superior colliculus neurons. Several temporal principles that govern multisensory integration were revealed: (1) maximal levels of response enhancement were generated by overlapping the peak discharge periods evoked by each modality; (2) the magnitude of this enhancement decayed monotonically to zero as the peak discharge periods became progressively more temporally disparate; (3) with further increases in temporal disparity, the same stimulus combinations that previously produced enhancement could often produce depression; and (4) these kinds of interactions could frequently be predicted from the discharge trains initiated by each stimulus alone. Since multisensory superior colliculus neurons project to premotor areas of the brain stem and spinal cord that control the orientation of the receptor organs (eyes, pinnae, head), they are believed to influence attentive and orientation behaviors. Therefore, it is likely that the temporal relationships of different environmental stimuli that control the activity of these neurons are also a powerful determinant of superior colliculus-mediated attentive and orientation behaviors.
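The enhancement and depression described in this reference are commonly quantified in the multisensory literature as the percent change of the combined response relative to the best unisensory response. A minimal sketch of that index (the function name and the spike counts are illustrative, not taken from the study):

```python
def enhancement_index(multisensory_count, best_unisensory_count):
    """Percent multisensory enhancement relative to the most effective
    unisensory response: 100 * (CM - SMmax) / SMmax, where CM is the
    combined-modality response and SMmax the best single-modality one."""
    return 100.0 * (multisensory_count - best_unisensory_count) / best_unisensory_count

# Enhancement when the combined response exceeds the best unisensory one,
print(enhancement_index(15, 10))  # -> 50.0
# and depression when large temporal disparity drives it below that level.
print(enhancement_index(6, 10))   # -> -40.0
```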

              Multisensory auditory–visual interactions during early sensory processing in humans: a high-density electrical mapping study


                Author and article information

                Exp Brain Res
                Experimental Brain Research. Experimentelle Hirnforschung. Experimentation Cerebrale
                Springer-Verlag (Berlin/Heidelberg)
                4 June 2009
                September 2009
                198(2-3): 313-328
                [1 ]Cognitive Psychology Department, Vrije Universiteit Amsterdam, Amsterdam, The Netherlands
                [2 ]Department of Cognitive Psychology and Ergonomics, University of Twente, PO Box 215, 7500 AE Enschede, The Netherlands
                [3 ]Department of Neurophysiology and Pathophysiology, University Medical Center Hamburg-Eppendorf, Hamburg, Germany
                [4 ]Center for Cognitive Neuroscience and Department of Psychiatry, Duke University, Durham, NC, USA
                © The Author(s) 2009
                Research Article
                © Springer-Verlag 2009


                electrophysiology, ERP, multisensory, EEG, SOA

