
      Neural pathways for visual speech perception


          Abstract

          This paper examines two questions: what levels of speech can be perceived visually, and how is visual speech represented by the brain? A review of the literature leads to the conclusions that every level of psycholinguistic speech structure (i.e., phonetic features, phonemes, syllables, words, and prosody) can be perceived visually, although individuals differ in their abilities to do so; and that there are visual modality-specific representations of speech qua speech in higher-level visual brain areas. That is, the visual system represents the modal patterns of visual speech. The suggestion that the auditory speech pathway receives and represents visual speech is examined in light of neuroimaging evidence on the auditory speech pathways. We outline the generally agreed-upon organization of the visual ventral and dorsal pathways and examine several types of visual processing that might be related to speech through those pathways, specifically face and body, orthography, and sign language processing. In this context, we examine the visual speech processing literature, which reveals widespread, diverse patterns of activity in posterior temporal cortices in response to visual speech stimuli. We outline a model of the visual and auditory speech pathways and make several suggestions: (1) The visual perception of speech relies on visual pathway representations of speech qua speech. (2) A proposed site of these representations, the temporal visual speech area (TVSA), has been demonstrated in posterior temporal cortex, ventral and posterior to the multisensory posterior superior temporal sulcus (pSTS). (3) Given that visual speech has dynamic and configural features, its representations in feedforward visual pathways are expected to integrate these features, possibly in TVSA.

          Most cited references (165)


          From sensation to cognition.

          M. Mesulam (1998)
          Sensory information undergoes extensive associative elaboration and attentional modulation as it becomes incorporated into the texture of cognition. This process occurs along a core synaptic hierarchy which includes the primary sensory, upstream unimodal, downstream unimodal, heteromodal, paralimbic and limbic zones of the cerebral cortex. Connections from one zone to another are reciprocal and allow higher synaptic levels to exert a feedback (top-down) influence upon earlier levels of processing. Each cortical area provides a nexus for the convergence of afferents and divergence of efferents. The resultant synaptic organization supports parallel as well as serial processing, and allows each sensory event to initiate multiple cognitive and behavioural outcomes. Upstream sectors of unimodal association areas encode basic features of sensation such as colour, motion, form and pitch. More complex contents of sensory experience such as objects, faces, word-forms, spatial locations and sound sequences become encoded within downstream sectors of unimodal areas by groups of coarsely tuned neurons. The highest synaptic levels of sensory-fugal processing are occupied by heteromodal, paralimbic and limbic cortices, collectively known as transmodal areas. The unique role of these areas is to bind multiple unimodal and other transmodal areas into distributed but integrated multimodal representations. Transmodal areas in the midtemporal cortex, Wernicke's area, the hippocampal-entorhinal complex and the posterior parietal cortex provide critical gateways for transforming perception into recognition, word-forms into meaning, scenes and events into experiences, and spatial locations into targets for exploration. All cognitive processes arise from analogous associative transformations of similar sets of sensory inputs. 
The differences in the resultant cognitive operation are determined by the anatomical and physiological properties of the transmodal node that acts as the critical gateway for the dominant transformation. Interconnected sets of transmodal nodes provide anatomical and computational epicentres for large-scale neurocognitive networks. In keeping with the principles of selectively distributed processing, each epicentre of a large-scale network displays a relative specialization for a specific behavioural component of its principal neuropsychological domain. The destruction of transmodal epicentres causes global impairments such as multimodal anomia, neglect and amnesia, whereas their selective disconnection from relevant unimodal areas elicits modality-specific impairments such as prosopagnosia, pure word blindness and category-specific anomias. The human brain contains at least five anatomically distinct networks. The network for spatial awareness is based on transmodal epicentres in the posterior parietal cortex and the frontal eye fields; the language network on epicentres in Wernicke's and Broca's areas; the explicit memory/emotion network on epicentres in the hippocampal-entorhinal complex and the amygdala; the face-object recognition network on epicentres in the midtemporal and temporopolar cortices; and the working memory-executive function network on epicentres in the lateral prefrontal cortex and perhaps the posterior parietal cortex. Individual sensory modalities give rise to streams of processing directed to transmodal nodes belonging to each of these networks. The fidelity of sensory channels is actively protected through approximately four synaptic levels of sensory-fugal processing. The modality-specific cortices at these four synaptic levels encode the most veridical representations of experience. 
Attentional, motivational and emotional modulations, including those related to working memory, novelty-seeking and mental imagery, become increasingly more pronounced within downstream components of unimodal areas, where they help to create a highly edited subjective version of the world. (ABSTRACT TRUNCATED)

            Ventral and dorsal pathways for language.

            Built on an analogy between the visual and auditory systems, the following dual stream model for language processing was suggested recently: a dorsal stream is involved in mapping sound to articulation, and a ventral stream in mapping sound to meaning. The goal of the study presented here was to test the neuroanatomical basis of this model. Combining functional magnetic resonance imaging (fMRI) with a novel diffusion tensor imaging (DTI)-based tractography method we were able to identify the most probable anatomical pathways connecting brain regions activated during two prototypical language tasks. Sublexical repetition of speech is subserved by a dorsal pathway, connecting the superior temporal lobe and premotor cortices in the frontal lobe via the arcuate and superior longitudinal fascicle. In contrast, higher-level language comprehension is mediated by a ventral pathway connecting the middle temporal lobe and the ventrolateral prefrontal cortex via the extreme capsule. Thus, according to our findings, the function of the dorsal route, traditionally considered to be the major language pathway, is mainly restricted to sensory-motor mapping of sound to articulation, whereas linguistic processing of sound to meaning requires temporofrontal interaction transmitted via the ventral route.

              Is neocortex essentially multisensory?

              Although sensory perception and neurobiology are traditionally investigated one modality at a time, real world behaviour and perception are driven by the integration of information from multiple sensory sources. Mounting evidence suggests that the neural underpinnings of multisensory integration extend into early sensory processing. This article examines the notion that neocortical operations are essentially multisensory. We first review what is known about multisensory processing in higher-order association cortices and then discuss recent anatomical and physiological findings in presumptive unimodal sensory areas. The pervasiveness of multisensory influences on all levels of cortical processing compels us to reconsider thinking about neural processing in unisensory terms. Indeed, the multisensory nature of most, possibly all, of the neocortex forces us to abandon the notion that the senses ever operate independently during real-world cognition.

                Author and article information

                Journal: Frontiers in Neuroscience (Front. Neurosci.)
                Publisher: Frontiers Media S.A.
                ISSN: 1662-4548 (print); 1662-453X (electronic)
                Published: 01 December 2014
                Volume: 8
                Affiliations
                1. Department of Speech and Hearing Sciences, George Washington University, Washington, DC, USA
                2. Department of Neurology, Medical College of Wisconsin, Milwaukee, WI, USA
                3. Department of Psychiatry, Brigham and Women's Hospital, Boston, MA, USA
                Author notes

                Edited by: Josef P. Rauschecker, Georgetown University School of Medicine, USA

                Reviewed by: Ruth Campbell, University College London, UK; Josef P. Rauschecker, Georgetown University School of Medicine, USA; Kaisa Tiippana, University of Helsinki, Finland

                *Correspondence: Lynne E. Bernstein, Communication Neuroscience Laboratory, Department of Speech and Hearing Science, George Washington University, 550 Rome Hall, 810 22nd Street, NW Washington, DC 20052, USA e-mail: lbernste@gwu.edu

                This article was submitted to the journal Frontiers in Neuroscience.

                Article
                DOI: 10.3389/fnins.2014.00386
                PMCID: PMC4248808
                PMID: 25520611
                Copyright © 2014 Bernstein and Liebenthal.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                Page count
                Figures: 1, Tables: 0, Equations: 0, References: 235, Pages: 18, Words: 18479
                Categories
                Psychology
                Review Article
