
      Validation of the Amsterdam Dynamic Facial Expression Set – Bath Intensity Variations (ADFES-BIV): A Set of Videos Expressing Low, Intermediate, and High Intensity Emotions

      research-article
      Tanja S. H. Wingenbach, Chris Ashwin, Mark Brosnan
      PLoS ONE
      Public Library of Science


          Abstract

          Most existing sets of facial expressions of emotion contain static photographs. While increasing demand for stimuli with enhanced ecological validity in facial emotion recognition research has led to the development of video stimuli, these typically involve full-blown (apex) expressions. However, variations in the intensity of emotional facial expressions occur in real-life social interactions, with low intensity expressions occurring frequently. The current study therefore developed and validated a set of video stimuli portraying three levels of intensity of emotional expressions, from low to high. The videos were adapted from the Amsterdam Dynamic Facial Expression Set (ADFES) and termed the Bath Intensity Variations (ADFES-BIV). A healthy sample of 92 people recruited from the University of Bath community (41 male, 51 female) completed a facial emotion recognition task including expressions of 6 basic emotions (anger, happiness, disgust, fear, surprise, sadness) and 3 complex emotions (contempt, embarrassment, pride), each expressed at three different intensity levels, plus neutral expressions. Accuracy scores (raw and unbiased (Hu) hit rates) were calculated, as well as response times. Accuracy rates above the chance level of responding were found for all emotion categories, producing an overall raw hit rate of 69% for the ADFES-BIV. The three intensity levels were validated as distinct categories, with higher accuracies and faster responses to high intensity expressions than to intermediate intensity expressions, which in turn had higher accuracies and faster responses than low intensity expressions. To further validate the intensities, a second study with standardised display times was conducted, replicating this pattern. The ADFES-BIV has greater ecological validity than many other emotion stimulus sets and allows for versatile applications in emotion research. It can be obtained free of charge for research purposes from the corresponding author.
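
          As a concrete illustration of the accuracy measures reported above: the unbiased hit rate (Hu; commonly attributed to Wagner, 1993) divides the squared number of hits for a category by the product of the number of stimuli presented in that category and the total number of times that response label was used, so it discounts accuracy gained by overusing a label. The sketch below is illustrative only; the function name and the toy confusion matrix are invented here, not taken from the study.

```python
import numpy as np

def hit_rates(confusion):
    """Raw and unbiased (Hu) hit rates from a stimulus-by-response
    confusion matrix, where confusion[i, j] counts trials on which a
    stimulus of category i received response j."""
    stim_totals = confusion.sum(axis=1)   # stimuli presented per category
    resp_totals = confusion.sum(axis=0)   # times each response label was used
    correct = np.diag(confusion)          # correct identifications (diagonal)

    raw = correct / stim_totals
    # Hu squares the hits and divides by both marginals, so a response
    # label that is guessed indiscriminately scores below raw accuracy.
    # (Assumes every response label was used at least once.)
    hu = correct.astype(float) ** 2 / (stim_totals * resp_totals)
    return raw, hu

# Hypothetical 3-category example (not data from the study):
conf = np.array([[8, 1, 1],
                 [2, 7, 1],
                 [3, 2, 5]])
raw, hu = hit_rates(conf)
print("raw:", raw.round(2))  # [0.8  0.7  0.5 ]
print("Hu: ", hu.round(2))   # [0.49 0.49 0.36]
```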


          Most cited references (43)


          Measuring emotion: the Self-Assessment Manikin and the Semantic Differential.

          The Self-Assessment Manikin (SAM) is a non-verbal pictorial assessment technique that directly measures the pleasure, arousal, and dominance associated with a person's affective reaction to a wide variety of stimuli. In this experiment, we compare reports of affective experience obtained using SAM, which requires only three simple judgments, to the Semantic Differential scale devised by Mehrabian and Russell (An approach to environmental psychology, 1974), which requires 18 different ratings. Subjective reports were collected for a series of pictures that varied in both affective valence and intensity. Correlations across the two rating methods were high both for reports of experienced pleasure and felt arousal. Differences obtained in the dominance dimension of the two instruments suggest that SAM may better track the personal response to an affective stimulus. SAM is an inexpensive, easy method for quickly assessing reports of affective response in many contexts.

            On the universality and cultural specificity of emotion recognition: a meta-analysis.

            A meta-analysis examined emotion recognition within and across cultures. Emotions were universally recognized at better-than-chance levels. Accuracy was higher when emotions were both expressed and recognized by members of the same national, ethnic, or regional group, suggesting an in-group advantage. This advantage was smaller for cultural groups with greater exposure to one another, measured in terms of living in the same nation, physical proximity, and telephone communication. Majority group members were poorer at judging minority group members than the reverse. Cross-cultural accuracy was lower in studies that used a balanced research design, and higher in studies that used imitation rather than posed or spontaneous emotional expressions. Attributes of study design appeared not to moderate the size of the in-group advantage.

              Recognizing emotion from facial expressions: psychological and neurological mechanisms.

              Recognizing emotion from facial expressions draws on diverse psychological processes implemented in a large array of neural structures. Studies using evoked potentials, lesions, and functional imaging have begun to elucidate some of the mechanisms. Early perceptual processing of faces draws on cortices in occipital and temporal lobes that construct detailed representations from the configuration of facial features. Subsequent recognition requires a set of structures, including amygdala and orbitofrontal cortex, that links perceptual representations of the face to the generation of knowledge about the emotion signaled, a complex set of mechanisms using multiple strategies. Although recent studies have provided a wealth of detail regarding these mechanisms in the adult human brain, investigations are also being extended to nonhuman primates, to infants, and to patients with psychiatric disorders.

                Author and article information

                Contributors
                Role: Editor (affiliation: University of Udine, Italy)

                Journal
                PLoS ONE
                Public Library of Science (San Francisco, CA, USA)
                ISSN: 1932-6203
                Published: 19 January 2016
                Volume 11, Issue 1: e0147112

                Affiliations
                [001] Department of Psychology, University of Bath, Bath, United Kingdom
                Author notes

                Competing Interests: The authors have declared that no competing interests exist.

                Conceived and designed the experiments: TSHW MB CA. Performed the experiments: TSHW. Analyzed the data: TSHW. Wrote the paper: TSHW MB CA. Created the stimuli: TSHW. Interpreted the data: TSHW MB CA.

                Article
                Manuscript ID: PONE-D-15-32940
                DOI: 10.1371/journal.pone.0147112
                PMCID: PMC4718603
                PMID: 26784347
                © 2016 Wingenbach et al.

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

                History
                Received: 27 July 2015
                Accepted: 29 December 2015
                Page count
                Figures: 7, Tables: 3, Pages: 28
                Funding
                This work was supported by the Department of Psychology of the University of Bath and by doctoral scholarships to TSHW from the German Academic Exchange Service (DAAD), the FAZIT Stiftung, and the University of Bath Graduate School.
                Categories
                Research Article
                Custom metadata
                All relevant data are within the paper and its Supporting Information files.

