
      Emotional Cues during Simultaneous Face and Voice Processing: Electrophysiological Insights


          Abstract

          Both facial expression and tone of voice are key signals of emotional communication, but their brain processing correlates remain unclear. Accordingly, we constructed a novel implicit emotion recognition task consisting of simultaneously presented human faces and voices with neutral, happy, and angry valence, embedded within a monkey face and voice recognition task. To investigate the temporal unfolding of the processing of affective information from human face-voice pairings, we recorded event-related potentials (ERPs) to these audiovisual test stimuli in 18 normal healthy subjects; N100, P200, N250, and P300 components were observed at electrodes in the frontal-central region, while P100, N170, and P270 were observed at electrodes in the parietal-occipital region. Results indicated a significant audiovisual stimulus effect on the amplitudes and latencies of components in the frontal-central region (P200, P300, and N250) but not the parietal-occipital region (P100, N170, and P270). Specifically, P200 and P300 amplitudes were more positive for emotional relative to neutral audiovisual stimuli, irrespective of valence, whereas N250 amplitude was more negative for neutral relative to emotional stimuli. No differentiation was observed between the angry and happy conditions. The results suggest that the general effect of emotion on audiovisual processing can emerge as early as 200 msec (P200 peak latency) after stimulus onset, despite the implicit affective processing task demands, and that this effect is mainly distributed over the frontal-central region.
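
          As a rough illustration of the kind of measurement described above, the sketch below shows how peak amplitude and latency for frontal-central components such as the P200 and N250 could be read off condition-averaged waveforms. This is a minimal sketch in Python/NumPy, not the authors' pipeline; the sampling rate, epoch limits, latency windows, and the synthetic waveforms are all illustrative assumptions.

```python
import numpy as np

SFREQ = 500                                  # assumed sampling rate (Hz)
times = np.arange(-0.2, 0.8, 1.0 / SFREQ)    # assumed epoch: -200 to 800 ms

def peak_in_window(erp, tmin, tmax, polarity=+1):
    """Return (amplitude, latency in s) of the most positive (polarity=+1)
    or most negative (polarity=-1) point inside a latency window."""
    idx = np.where((times >= tmin) & (times <= tmax))[0]
    best = idx[np.argmax(polarity * erp[idx])]
    return erp[best], times[best]

# Hypothetical condition-averaged ERPs at one frontal-central electrode
# (e.g., FCz), in microvolts; real data would come from averaged epochs.
rng = np.random.default_rng(0)
erps = {cond: rng.standard_normal(times.size)
        for cond in ("neutral", "happy", "angry")}

for cond, erp in erps.items():
    p200_amp, p200_lat = peak_in_window(erp, 0.15, 0.25, polarity=+1)
    n250_amp, n250_lat = peak_in_window(erp, 0.20, 0.30, polarity=-1)
    print(f"{cond}: P200 {p200_amp:+.2f} uV @ {p200_lat*1e3:.0f} ms, "
          f"N250 {n250_amp:+.2f} uV @ {n250_lat*1e3:.0f} ms")
```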

          Most cited references (72)


          Effects of attention and emotion on face processing in the human brain: an event-related fMRI study.

          We used event-related fMRI to assess whether brain responses to fearful versus neutral faces are modulated by spatial attention. Subjects performed a demanding matching task for pairs of stimuli at prespecified locations, in the presence of task-irrelevant stimuli at other locations. Faces or houses unpredictably appeared at the relevant or irrelevant locations, while the faces had either fearful or neutral expressions. Activation of fusiform gyri by faces was strongly affected by attentional condition, but the left amygdala response to fearful faces was not. Right fusiform activity was greater for fearful than neutral faces, independently of the attention effect on this region. These results reveal differential influences on face processing from attention and emotion, with the amygdala response to threat-related expressions unaffected by a manipulation of attention that strongly modulates the fusiform response to faces.
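
          The logic of the 2 (attention) x 2 (expression) comparison summarized above can be made concrete with a small numeric sketch. This is not the study's event-related fMRI analysis; the region names and beta values below are made up purely to show how main effects and the interaction would be computed from condition means.

```python
import numpy as np

# Hypothetical per-condition response estimates (e.g., GLM betas) for two
# regions of interest; rows = attended / unattended, columns = fearful / neutral.
roi_betas = {
    "left_amygdala":  np.array([[1.0, 0.3],
                                [0.9, 0.3]]),
    "fusiform_gyrus": np.array([[2.0, 1.6],
                                [0.8, 0.6]]),
}

for roi, b in roi_betas.items():
    attention_effect = b[0].mean() - b[1].mean()         # attended - unattended
    emotion_effect   = b[:, 0].mean() - b[:, 1].mean()   # fearful - neutral
    interaction      = (b[0, 0] - b[0, 1]) - (b[1, 0] - b[1, 1])
    print(f"{roi}: attention {attention_effect:+.2f}, "
          f"emotion {emotion_effect:+.2f}, interaction {interaction:+.2f}")
```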

            Auditory-visual integration during multimodal object recognition in humans: a behavioral and electrophysiological study.

            The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: At each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two objects were defined by auditory features alone, visual features alone, or the combination of auditory and visual features. Subjects were more accurate and rapid at identifying multimodal than unimodal objects. Spatiotemporal analysis of ERPs and scalp current densities revealed several auditory-visual interaction components temporally, spatially, and functionally distinct before 200 msec poststimulus. The effects observed were (1) in visual areas, new neural activities (as early as 40 msec poststimulus) and modulation (amplitude decrease) of the N185 wave to unimodal visual stimulus, (2) in the auditory cortex, modulation (amplitude increase) of subcomponents of the unimodal auditory N1 wave around 90 to 110 msec, and (3) new neural activity over the right fronto-temporal area (140 to 165 msec). Furthermore, when the subjects were separated into two groups according to their dominant modality to perform the task in unimodal conditions (shortest reaction time criteria), the integration effects were found to be similar for the two groups over the nonspecific fronto-temporal areas, but they clearly differed in the sensory-specific cortices, affecting predominantly the sensory areas of the nondominant modality. Taken together, the results indicate that multisensory integration is mediated by flexible, highly adaptive physiological processes that can take place very early in the sensory processing chain and operate in both sensory-specific and nonspecific cortical structures in different ways.
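
            A common way to quantify audiovisual interactions of the kind described above is the additive model, which compares the bimodal response with the sum of the unimodal responses, AV - (A + V). The sketch below illustrates that comparison on synthetic waveforms; it is not the authors' spatiotemporal or scalp-current-density analysis, and the sampling rate, epoch, and data are assumed.

```python
import numpy as np

SFREQ = 500
times = np.arange(-0.1, 0.5, 1.0 / SFREQ)     # assumed epoch in seconds

# Hypothetical grand-average ERPs at one electrode (microvolts).
rng = np.random.default_rng(1)
erp_a  = rng.standard_normal(times.size)      # auditory-only condition
erp_v  = rng.standard_normal(times.size)      # visual-only condition
erp_av = rng.standard_normal(times.size)      # audiovisual condition

# Additive-model residual: nonzero values indicate multisensory interaction.
interaction = erp_av - (erp_a + erp_v)

# Focus on the pre-200 ms period where the interactions reported above occur.
early = (times >= 0.0) & (times <= 0.2)
print(f"mean AV - (A + V) amplitude, 0-200 ms: {interaction[early].mean():+.2f} uV")
```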

              A review of the evidence for P2 being an independent component process: age, sleep and modality.

              This article reviews the event-related potential (ERP) literature in relation to the P2 waveform of the human auditory evoked potential. Within the auditory evoked potential, a positive deflection at approximately 150-250 ms is a ubiquitous feature. Unlike other cognitive components such as the N1 or the P300, remarkably little has been done to investigate the underlying neurological correlates or significance of this waveform. Indeed, until recently many researchers considered it an intrinsic part of the 'vertex potential' complex, together with the earlier N1. This review describes the evidence supporting the P2 as the result of independent component processes and highlights several of its features, such as its persistence from wakefulness into sleep, the general consensus that, unlike most other EEG phenomena, it increases with age, and the fact that it can be generated using respiratory stimuli.
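
              In practice, the P2 discussed in this review is usually quantified as a mean or peak amplitude within roughly a 150-250 ms post-stimulus window at fronto-central sites. The sketch below shows one such measurement on a synthetic waveform; the electrode, window, and data are assumptions, not values from the review.

```python
import numpy as np

SFREQ = 1000
times = np.arange(-0.1, 0.5, 1.0 / SFREQ)       # assumed epoch in seconds
rng = np.random.default_rng(2)
erp_cz = rng.standard_normal(times.size)        # hypothetical Cz waveform (uV)

# Mean amplitude and most positive peak within an assumed 150-250 ms window.
win = np.where((times >= 0.150) & (times <= 0.250))[0]
p2_mean = erp_cz[win].mean()
peak = win[np.argmax(erp_cz[win])]
print(f"P2 mean amplitude: {p2_mean:+.2f} uV, "
      f"peak latency: {times[peak]*1e3:.0f} ms")
```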

                Author and article information

                Contributors
                Role: Editor
                Journal
                PLoS ONE
                Public Library of Science (San Francisco, USA)
                ISSN: 1932-6203
                Published: 22 February 2012
                Volume 7, Issue 2: e31001
                Affiliations
                [1 ]Department of Psychology, Second Military Medical University, Shanghai, China
                [2 ]Clinical Neuroscience Division, Laboratory of Neuroscience, Department of Psychiatry, Boston VA Healthcare System, Brockton Division and Harvard Medical School, Brockton, Massachusetts, United States of America
                [3 ]Neuropsychophysiology Laboratory, CiPsi, School of Psychology, University of Minho, Braga, Portugal
                [4 ]Department of Neurology, Neuroscience Research Center of Changzheng Hospital, Second Military Medical University, Shanghai, China
                [5 ]University of Massachusetts, Boston, Massachusetts, United States of America
                Cuban Neuroscience Center, Cuba
                Author notes

                Conceived and designed the experiments: TSL MAN ZXZ. Performed the experiments: TSL AP. Analyzed the data: TSL AP MAN. Contributed reagents/materials/analysis tools: RWM PGN. Wrote the paper: TSL AP MAN PGN.

                Article
                Manuscript ID: PONE-D-11-15537
                DOI: 10.1371/journal.pone.0031001
                PMCID: PMC3285164
                PMID: 22383987
                This is an open-access article, free of all copyright, and may be freely reproduced, distributed, transmitted, modified, built upon, or otherwise used by anyone for any lawful purpose. The work is made available under the Creative Commons CC0 public domain dedication.
                History
                Received: 4 August 2011
                Accepted: 28 December 2011
                Page count
                Pages: 10
                Categories
                Research Article
                Biology
                Neuroscience
                Sensory Perception
                Sensory Systems
                Medicine
                Mental Health
                Psychology
                Behavior
                Social and Behavioral Sciences
                Psychology
                Behavior

