
      Multisensory Interplay Reveals Crossmodal Influences on ‘Sensory-Specific’ Brain Regions, Neural Responses, and Judgments


      Neuron

      Cell Press


          Abstract

          Although much traditional sensory research has studied each sensory modality in isolation, there has been a recent explosion of interest in causal interplay between different senses. Various techniques have now identified numerous multisensory convergence zones in the brain. Some convergence may arise surprisingly close to low-level sensory-specific cortex, and some direct connections may exist even between primary sensory cortices. A variety of multisensory phenomena have now been reported in which sensory-specific brain responses and perceptual judgments concerning one sense can be affected by relations with other senses. We survey recent progress in this multisensory field, foregrounding human studies against the background of invasive animal work and highlighting possible underlying mechanisms. These include rapid feedforward integration, possible thalamic influences, and/or feedback from multisensory regions to sensory-specific brain areas. Multisensory interplay is more prevalent than classic modular approaches assumed, and new methods are now available to determine the underlying circuits.

          Related collections

          Most cited references: 138


          Repetition and the brain: neural models of stimulus-specific effects.

          One of the most robust experience-related cortical dynamics is reduced neural activity when stimuli are repeated. This reduction has been linked to performance improvements due to repetition and also used to probe functional characteristics of neural populations. However, the underlying neural mechanisms are as yet unknown. Here, we consider three models that have been proposed to account for repetition-related reductions in neural activity, and evaluate them in terms of their ability to account for the main properties of this phenomenon as measured with single-cell recordings and neuroimaging techniques. We also discuss future directions for distinguishing between these models, which will be important for understanding the neural consequences of repetition and for interpreting repetition-related effects in neuroimaging data.

            From sensation to cognition.

             M. Mesulam (1998)
            Sensory information undergoes extensive associative elaboration and attentional modulation as it becomes incorporated into the texture of cognition. This process occurs along a core synaptic hierarchy which includes the primary sensory, upstream unimodal, downstream unimodal, heteromodal, paralimbic and limbic zones of the cerebral cortex. Connections from one zone to another are reciprocal and allow higher synaptic levels to exert a feedback (top-down) influence upon earlier levels of processing. Each cortical area provides a nexus for the convergence of afferents and divergence of efferents. The resultant synaptic organization supports parallel as well as serial processing, and allows each sensory event to initiate multiple cognitive and behavioural outcomes. Upstream sectors of unimodal association areas encode basic features of sensation such as colour, motion, form and pitch. More complex contents of sensory experience such as objects, faces, word-forms, spatial locations and sound sequences become encoded within downstream sectors of unimodal areas by groups of coarsely tuned neurons. The highest synaptic levels of sensory-fugal processing are occupied by heteromodal, paralimbic and limbic cortices, collectively known as transmodal areas. The unique role of these areas is to bind multiple unimodal and other transmodal areas into distributed but integrated multimodal representations. Transmodal areas in the midtemporal cortex, Wernicke's area, the hippocampal-entorhinal complex and the posterior parietal cortex provide critical gateways for transforming perception into recognition, word-forms into meaning, scenes and events into experiences, and spatial locations into targets for exploration. All cognitive processes arise from analogous associative transformations of similar sets of sensory inputs. 
The differences in the resultant cognitive operation are determined by the anatomical and physiological properties of the transmodal node that acts as the critical gateway for the dominant transformation. Interconnected sets of transmodal nodes provide anatomical and computational epicentres for large-scale neurocognitive networks. In keeping with the principles of selectively distributed processing, each epicentre of a large-scale network displays a relative specialization for a specific behavioural component of its principal neuropsychological domain. The destruction of transmodal epicentres causes global impairments such as multimodal anomia, neglect and amnesia, whereas their selective disconnection from relevant unimodal areas elicits modality-specific impairments such as prosopagnosia, pure word blindness and category-specific anomias. The human brain contains at least five anatomically distinct networks. The network for spatial awareness is based on transmodal epicentres in the posterior parietal cortex and the frontal eye fields; the language network on epicentres in Wernicke's and Broca's areas; the explicit memory/emotion network on epicentres in the hippocampal-entorhinal complex and the amygdala; the face-object recognition network on epicentres in the midtemporal and temporopolar cortices; and the working memory-executive function network on epicentres in the lateral prefrontal cortex and perhaps the posterior parietal cortex. Individual sensory modalities give rise to streams of processing directed to transmodal nodes belonging to each of these networks. The fidelity of sensory channels is actively protected through approximately four synaptic levels of sensory-fugal processing. The modality-specific cortices at these four synaptic levels encode the most veridical representations of experience. 
Attentional, motivational and emotional modulations, including those related to working memory, novelty-seeking and mental imagery, become increasingly more pronounced within downstream components of unimodal areas, where they help to create a highly edited subjective version of the world. (ABSTRACT TRUNCATED)

              The ventriloquist effect results from near-optimal bimodal integration.

              Ventriloquism is the ancient art of making one's voice appear to come from elsewhere, an art exploited by the Greek and Roman oracles, and possibly earlier. We regularly experience the effect when watching television and movies, where the voices seem to emanate from the actors' lips rather than from the actual sound source. Originally, ventriloquism was explained by performers projecting sound to their puppets by special techniques, but more recently it is assumed that ventriloquism results from vision "capturing" sound. In this study we investigate spatial localization of audio-visual stimuli. When visual localization is good, vision does indeed dominate and capture sound. However, for severely blurred visual stimuli (that are poorly localized), the reverse holds: sound captures vision. For less blurred stimuli, neither sense dominates and perception follows the mean position. Precision of bimodal localization is usually better than either the visual or the auditory unimodal presentation. All the results are well explained not by one sense capturing the other, but by a simple model of optimal combination of visual and auditory information.
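              The "simple model of optimal combination" invoked above corresponds to standard inverse-variance (maximum-likelihood) cue weighting: each modality's estimate is weighted by its reliability, so a sharp visual cue dominates while a blurred one cedes to audition, and the combined estimate is always at least as precise as either cue alone. A minimal illustrative sketch, with a function name and example variances of our own choosing (not values from the study):

```python
def combine_cues(s_vis, var_vis, s_aud, var_aud):
    """Inverse-variance weighted combination of visual and auditory
    position estimates. Returns (combined estimate, combined variance)."""
    # Weight each cue by its reliability (1 / variance), normalized to sum to 1.
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_aud)
    w_aud = 1.0 - w_vis
    s_hat = w_vis * s_vis + w_aud * s_aud
    # Combined variance is smaller than either unimodal variance,
    # matching the observed bimodal precision benefit.
    var_hat = (var_vis * var_aud) / (var_vis + var_aud)
    return s_hat, var_hat

# Sharp visual cue (small variance): vision "captures" sound.
print(combine_cues(0.0, 1.0, 10.0, 25.0))
# Heavily blurred visual cue: the auditory estimate dominates instead.
print(combine_cues(0.0, 25.0, 10.0, 1.0))
```

With equal variances neither sense dominates and the estimate falls at the mean position, just as reported for intermediate blur.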

                Author and article information

                Journal
                Neuron
                Cell Press
                ISSN: 0896-6273 (print); 1097-4199 (electronic)
                Published: 10 January 2008
                Volume 57, Issue 1, pages 11-23
                Affiliations
                [1 ]UCL Institute of Cognitive Neuroscience, University College London, 17 Queen Square, London WC1N 3AR, UK
                [2 ]Department of Neurology II, Otto-von-Guericke-Universität, Leipziger Str. 44, 39120 Magdeburg, Germany
                Author notes
                [∗ ]Corresponding author j.driver@ucl.ac.uk
                [∗∗ ]Corresponding author toemme@med.ovgu.de
                Article
                Article ID: NEURON3182
                DOI: 10.1016/j.neuron.2007.12.013
                PMC: 2427054
                PMID: 18184561
                © 2008 ELL & Excerpta Medica.

                This document may be redistributed and reused, subject to certain conditions.

                Categories
                Review

                Neurosciences
