      Is Open Access

      The Language, Tone and Prosody of Emotions: Neural Substrates and Dynamics of Spoken-Word Emotion Perception

      Review article


          Abstract

          Rapid assessment of emotions is important for detecting and prioritizing salient input. Emotions are conveyed in spoken words via verbal and non-verbal channels that are mutually informative and unfold in parallel over time, but the neural dynamics and interactions of these processes are not well understood. In this paper, we review the literature on emotion perception in faces, written words, and voices, as a basis for understanding the functional organization of emotion perception in spoken words. The characteristics of visual and auditory routes to the amygdala—a subcortical center for emotion perception—are compared across these stimulus classes in terms of neural dynamics, hemispheric lateralization, and functionality. Converging results from neuroimaging, electrophysiological, and lesion studies suggest the existence of an afferent route to the amygdala and primary visual cortex for fast and subliminal processing of coarse emotional face cues. We suggest that a fast route to the amygdala may also function for brief non-verbal vocalizations (e.g., laughing, crying), in which the emotional category is conveyed effectively by voice tone and intensity. However, emotional prosody, which evolves on longer time scales and is conveyed by fine-grained spectral cues, appears to be processed via a slower, indirect cortical route. For verbal emotional content, the bulk of current evidence—indicating predominant left lateralization of the amygdala response and timing of emotional effects attributable to speeded lexical access—is more consistent with an indirect cortical route to the amygdala. Top-down linguistic modulation may play an important role in prioritized perception of emotions in words. Understanding the neural dynamics and interactions of emotion and language perception is important for selecting potent stimuli and devising effective training and/or treatment approaches for the alleviation of emotional dysfunction across a range of neuropsychiatric states.

          Related collections

          Most cited references: 156


          The N1 wave of the human electric and magnetic response to sound: a review and an analysis of the component structure.


            Conscious and unconscious emotional learning in the human amygdala.

            If subjects are shown an angry face as a target visual stimulus for less than forty milliseconds and are then immediately shown an expressionless mask, these subjects report seeing the mask but not the target. However, an aversively conditioned masked target can elicit an emotional response from subjects without being consciously perceived. Here we study the mechanism of this unconsciously mediated emotional learning. We measured neural activity in volunteer subjects who were presented with two angry faces, one of which, through previous classical conditioning, was associated with a burst of white noise. In half of the trials, the subjects' awareness of the angry faces was prevented by backward masking with a neutral face. A significant neural response was elicited in the right, but not left, amygdala to masked presentations of the conditioned angry face. Unmasked presentations of the same face produced enhanced neural activity in the left, but not right, amygdala. Our results indicate that, first, the human amygdala can discriminate between stimuli solely on the basis of their acquired behavioural significance, and second, this response is lateralized according to the subjects' level of awareness of the stimuli.

              Effects of attention and emotion on face processing in the human brain: an event-related fMRI study.

              We used event-related fMRI to assess whether brain responses to fearful versus neutral faces are modulated by spatial attention. Subjects performed a demanding matching task for pairs of stimuli at prespecified locations, in the presence of task-irrelevant stimuli at other locations. Faces or houses unpredictably appeared at the relevant or irrelevant locations, while the faces had either fearful or neutral expressions. Activation of fusiform gyri by faces was strongly affected by attentional condition, but the left amygdala response to fearful faces was not. Right fusiform activity was greater for fearful than neutral faces, independently of the attention effect on this region. These results reveal differential influences on face processing from attention and emotion, with the amygdala response to threat-related expressions unaffected by a manipulation of attention that strongly modulates the fusiform response to faces.

                Author and article information

                Contributors
                Journal
                Frontiers in Neuroscience (Front. Neurosci.)
                Publisher: Frontiers Media S.A.
                ISSN (print): 1662-4548; ISSN (online): 1662-453X
                Published: 08 November 2016
                Volume 10, Article 506
                Affiliations
                [1] Department of Psychiatry, Brigham and Women's Hospital, Boston, MA, USA
                [2] Department of Radiology, Brigham and Women's Hospital, Boston, MA, USA
                Author notes

                Edited by: Jonathan B. Fritz, The University of Maryland, College Park, USA

                Reviewed by: Dan Zhang, Tsinghua University, China; Iain DeWitt, National Institute of Deafness and Communication Disorders, USA

                *Correspondence: Einat Liebenthal, eliebenthal@partners.org

                This article was submitted to Auditory Cognitive Neuroscience, a section of the journal Frontiers in Neuroscience

                Article
                DOI: 10.3389/fnins.2016.00506
                PMCID: PMC5099784
                PMID: 27877106
                Copyright © 2016 Liebenthal, Silbersweig and Stern.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 26 January 2016
                Accepted: 24 October 2016
                Page count
                Figures: 1, Tables: 0, Equations: 0, References: 192, Pages: 13, Words: 12662
                Funding
                Funded by: Brain and Behavior Research Foundation (10.13039/100000874), Award ID: 22249
                Funded by: National Institutes of Health (10.13039/100000002), Award ID: NIDCD R01 DC006287
                Categories
                Neuroscience
                Review

                Neurosciences
                emotions, semantics, amygdala, word processing, fMRI, ERPs (event-related potentials), speech perception, voice perception
