
      Do all facial emojis communicate emotion? The impact of facial emojis on perceived sender emotion and text processing

      Computers in Human Behavior
      Elsevier BV


Most cited references (35)


          Thirty years and counting: finding meaning in the N400 component of the event-related brain potential (ERP).

We review the discovery, characterization, and evolving use of the N400, an event-related brain potential response linked to meaning processing. We describe the elicitation of N400s by an impressive range of stimulus types (including written, spoken, and signed words or pseudowords; drawings, photos, and videos of faces, objects, and actions; sounds; and mathematical symbols) and outline the sensitivity of N400 amplitude (as its latency is remarkably constant) to linguistic and nonlinguistic manipulations. We emphasize the effectiveness of the N400 as a dependent variable for examining almost every aspect of language processing and highlight its expanding use to probe semantic memory and to determine how the neurocognitive system dynamically and flexibly uses bottom-up and top-down information to make sense of the world. We conclude with different theories of the N400's functional significance and offer an N400-inspired reconceptualization of how meaning processing might unfold.

            How to get statistically significant effects in any ERP experiment (and why you shouldn't).

            ERP experiments generate massive datasets, often containing thousands of values for each participant, even after averaging. The richness of these datasets can be very useful in testing sophisticated hypotheses, but this richness also creates many opportunities to obtain effects that are statistically significant but do not reflect true differences among groups or conditions (bogus effects). The purpose of this paper is to demonstrate how common and seemingly innocuous methods for quantifying and analyzing ERP effects can lead to very high rates of significant but bogus effects, with the likelihood of obtaining at least one such bogus effect exceeding 50% in many experiments. We focus on two specific problems: using the grand-averaged data to select the time windows and electrode sites for quantifying component amplitudes and latencies, and using one or more multifactor statistical analyses. Reanalyses of prior data and simulations of typical experimental designs are used to show how these problems can greatly increase the likelihood of significant but bogus results. Several strategies are described for avoiding these problems and for increasing the likelihood that significant effects actually reflect true differences among groups or conditions.
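The abstract's headline figure, a better-than-50% chance of at least one significant but bogus effect, follows from basic familywise-error arithmetic when many time windows or electrode sites are each tested at alpha = .05. The sketch below illustrates that arithmetic only; the test counts are illustrative assumptions, not figures taken from the paper.

```python
# Familywise false-positive rate: the probability of at least one
# significant-but-bogus result across k independent tests, each run
# at the conventional alpha = .05 level.

def familywise_rate(k, alpha=0.05):
    """P(at least one false positive) = 1 - P(no false positives in k tests)."""
    return 1 - (1 - alpha) ** k

# With 14 or more independent tests, the rate already exceeds 50%,
# consistent with the abstract's "exceeding 50% in many experiments".
for k in (1, 5, 14, 50):
    print(f"{k:>2} tests -> familywise rate {familywise_rate(k):.3f}")
```

Real ERP comparisons are correlated rather than independent, so the exact rate differs in practice, but the qualitative point stands: multiplying windows, sites, and factors multiplies opportunities for bogus effects.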

              Emotional Expressions Reconsidered: Challenges to Inferring Emotion From Human Facial Movements

It is commonly assumed that a person’s emotional state can be readily inferred from his or her facial movements, typically called emotional expressions or facial expressions. This assumption influences legal judgments, policy decisions, national security protocols, and educational practices; guides the diagnosis and treatment of psychiatric illness, as well as the development of commercial applications; and pervades everyday social interactions as well as research in other scientific fields such as artificial intelligence, neuroscience, and computer vision. In this article, we survey examples of this widespread assumption, which we refer to as the common view, and we then examine the scientific evidence that tests this view, focusing on the six most popular emotion categories used by consumers of emotion research: anger, disgust, fear, happiness, sadness, and surprise. The available scientific evidence suggests that people do sometimes smile when happy, frown when sad, scowl when angry, and so on, as proposed by the common view, more than what would be expected by chance. Yet how people communicate anger, disgust, fear, happiness, sadness, and surprise varies substantially across cultures, situations, and even across people within a single situation. Furthermore, similar configurations of facial movements variably express instances of more than one emotion category. In fact, a given configuration of facial movements, such as a scowl, often communicates something other than an emotional state. Scientists agree that facial movements convey a range of information and are important for social communication, emotional or otherwise. But our review suggests an urgent need for research that examines how people actually move their faces to express emotions and other social information in the variety of contexts that make up everyday life, as well as careful study of the mechanisms by which people perceive instances of emotion in one another. We make specific research recommendations that will yield a more valid picture of how people move their faces to express emotions and how they infer emotional meaning from facial movements in situations of everyday life. This research is crucial to provide consumers of emotion research with the translational information they require.

Author and article information

Journal: Computers in Human Behavior (Elsevier BV)
ISSN: 0747-5632
Published: January 2022
Volume: 126
Article: 107016
DOI: 10.1016/j.chb.2021.107016
© 2022
License: https://www.elsevier.com/tdm/userlicense/1.0/
