
      Professional actors demonstrate variability, not stereotypical expressions, when portraying emotional states in photographs


          Abstract

It has long been hypothesized that there is a reliable, specific mapping between certain emotional states and the facial movements that express those states. This hypothesis is often tested by asking untrained participants to pose the facial movements they believe they use to express emotions during generic scenarios. Here, we test this hypothesis using, as stimuli, photographs of facial configurations posed by professional actors in response to contextually rich scenarios. The scenarios portrayed in the photographs were rated by a convenience sample of participants for the extent to which they evoked an instance of 13 emotion categories, and the actors' facial poses were coded for their specific movements. Both unsupervised and supervised machine learning analyses find that, in these photographs, the actors portrayed emotional states with variable facial configurations; instances of only three emotion categories (fear, happiness, and surprise) were portrayed with moderate reliability and specificity. The photographs were separately rated by another sample of participants for the extent to which they portrayed an instance of the 13 emotion categories, both when presented alone and when presented with their associated scenarios, revealing that participants' emotion inferences also vary in a context-sensitive manner. Together, these findings suggest that facial movements and perceptions of emotion vary by situation and transcend stereotypes of emotional expressions. Future research may build on these findings by incorporating dynamic stimuli rather than photographs and by studying a broader range of cultural contexts.
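The reliability/specificity analysis described above can be illustrated with a brief sketch. The code below is not the authors' pipeline: it invents a matrix of binary facial action-unit codings and per-photograph emotion labels, then uses cross-validated classification (via scikit-learn, an assumed dependency) to estimate, for each category, how reliably its instances are recovered from facial movements alone (recall) and how specific the recovered configuration is to that category (precision).

# Hypothetical sketch, not the authors' code: invented action-unit
# codings and labels, used to estimate per-category reliability
# and specificity from facial movements alone.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import precision_score, recall_score

rng = np.random.default_rng(0)
EMOTIONS = ["fear", "happiness", "surprise"]  # subset of the 13 categories
N_PHOTOS, N_ACTION_UNITS = 600, 30            # invented dataset sizes

# X: binary facial action-unit codings, one row per photograph (invented).
X = rng.integers(0, 2, size=(N_PHOTOS, N_ACTION_UNITS))
# y: emotion category each photograph's scenario was rated as evoking (invented).
y = rng.choice(EMOTIONS, size=N_PHOTOS)

# Cross-validated predictions test whether facial movements alone
# recover the emotion category.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=5)

for emo in EMOTIONS:
    true, pred = (y == emo), (y_pred == emo)
    # Reliability ~ recall: how often instances of the category show a
    # recoverable configuration. Specificity ~ precision: how often that
    # configuration occurs only for this category.
    print(f"{emo:>10}  reliability={recall_score(true, pred):.2f}  "
          f"specificity={precision_score(true, pred):.2f}")

With real codings in place of the random data, low scores for most categories would correspond to the variable, non-stereotyped facial configurations reported above.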

          Editor's summary

          It has long been hypothesized that certain emotional states are universally expressed with specific facial movements. Here the authors provide evidence that facial expressions of those emotional states are, in fact, varied among individuals rather than stereotyped.


Author and article information

Contributors
l.barrett@northeastern.edu

Journal
Nature Communications (Nat Commun)
Nature Publishing Group UK (London)
ISSN: 2041-1723
Published: 19 August 2021
Volume: 12; Article: 5037

Affiliations
[1] Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA, USA
[2] Institute for High Performance Computing, Social and Cognitive Computing, Connexis North, Singapore
[3] Department of Psychology, Katholieke Universiteit Leuven, Leuven, Belgium
[4] Department of Neurology, University of Pennsylvania, Philadelphia, PA, USA
[5] Department of Psychology, University of Massachusetts at Dartmouth, Dartmouth, MA 02747, USA
[6] Department of Psychology, Yale University, New Haven, CT, USA
[7] Department of Psychology, Northeastern University, Boston, MA, USA
[8] Massachusetts General Hospital/Martinos Center for Biomedical Imaging, Charlestown, MA, USA

Author information
http://orcid.org/0000-0002-9938-7676
http://orcid.org/0000-0003-0831-4234
http://orcid.org/0000-0003-2668-7819
http://orcid.org/0000-0003-4478-2051

Article
Article ID: 25352
DOI: 10.1038/s41467-021-25352-6
PMCID: 8376986
PMID: 34413313
                © The Author(s) 2021

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

History
Received: 18 December 2019
Accepted: 2 August 2021
Funding
National Heart, Lung, and Blood Institute (NHLBI): Award F31 HL140943-01
National Institute of General Medical Sciences (NIGMS): Award GM118629
JPB Foundation
National Institute of Mental Health (NIMH): Awards 5 F32 MH105052, R01-MH113234, R01-MH109464
U.S. Army Research Institute for the Behavioral and Social Sciences: Award W911NF-16-1-019
National Cancer Institute (NCI): Award U01-CA193632
National Science Foundation (NSF): Civil, Mechanical, and Manufacturing Innovation Grant 1638234
National Eye Institute (NEI): Award R01-EY020834
                Categories
                Article

Keywords
communication, human behaviour
