
      Automatic detection and classification of emotional states in virtual reality and standard environments (LCD): comparing valence and arousal of induced emotions


          Abstract

          This case study was carried out with one experimental group and one control group. Participants in the experimental group watched a film clip from the standardized LATEMO-E database in virtual reality (VR) on Oculus Rift S and HTC Vive Pro devices; in the control group, the same clip was displayed on an LCD monitor. The clip was categorized according to Ekman's and Russell's classification models of the emotional state it evokes, and the range of valence and arousal was determined in both groups. Valence and arousal were measured in each group using the Self-Assessment Manikin (SAM). The control group was additionally recorded with a camera and evaluated with Affectiva's Affdex software so that the valence values could be compared. In the control group, the SAM and Affdex results showed a very high correlation (0.92). Taking the Affdex results as a reference value, it can be concluded that the participants rated their emotions objectively with the SAM. The results from both groups confirm that the film clip evokes the intended negative emotion, and this negative emotion was perceived more intensely than its positive counterpart. Using virtual reality to evoke a negative emotion (anger) confirmed that VR triggers a significantly stronger intensity of emotion than an LCD.
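
          As a rough, self-contained illustration of the SAM-versus-Affdex comparison described above (this is not the authors' analysis pipeline; the function and all data values below are invented for illustration), the following Python sketch computes a Pearson correlation between per-participant valence ratings from the two sources:

              # Minimal sketch, not the study's code: correlate self-reported SAM
              # valence with Affdex-derived valence for a control group.
              # All data values below are invented.
              from statistics import mean, stdev

              def pearson_r(xs, ys):
                  """Pearson correlation coefficient between two equal-length samples."""
                  mx, my = mean(xs), mean(ys)
                  cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
                  return cov / (stdev(xs) * stdev(ys))

              # Hypothetical valence ratings on a common 1-9 scale (SAM uses 1-9;
              # Affdex valence would first have to be rescaled to that range).
              sam_valence = [2.0, 3.0, 2.5, 1.5, 3.5, 2.0, 4.0, 2.5]
              affdex_valence = [2.2, 3.1, 2.4, 1.8, 3.6, 2.1, 3.8, 2.7]

              print(f"SAM vs. Affdex valence: r = {pearson_r(sam_valence, affdex_valence):.2f}")

          A coefficient close to the 0.92 reported above would indicate that the self-reports and the automatic facial-expression analysis track each other closely.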

          Most cited references (58)

          Measuring emotion: The self-assessment manikin and the semantic differential

            What are emotions? And how can they be measured?

              The Nencki Affective Picture System (NAPS): Introduction to a novel, standardized, wide-range, high-quality, realistic picture database

              Selecting appropriate stimuli to induce emotional states is essential in affective research. Only a few standardized affective stimulus databases have been created for auditory, language, and visual materials. Numerous studies have extensively employed these databases using both behavioral and neuroimaging methods. However, some limitations of the existing databases have recently been reported, including limited numbers of stimuli in specific categories or poor picture quality of the visual stimuli. In the present article, we introduce the Nencki Affective Picture System (NAPS), which consists of 1,356 realistic, high-quality photographs that are divided into five categories (people, faces, animals, objects, and landscapes). Affective ratings were collected from 204 mostly European participants. The pictures were rated according to the valence, arousal, and approach–avoidance dimensions using computerized bipolar semantic slider scales. Normative ratings for the categories are presented for each dimension. Validation of the ratings was obtained by comparing them to ratings generated using the Self-Assessment Manikin and the International Affective Picture System. In addition, the physical properties of the photographs are reported, including luminance, contrast, and entropy. The new database, with accompanying ratings and image parameters, allows researchers to select a variety of visual stimulus materials specific to their experimental questions of interest. The NAPS system is freely accessible to the scientific community for noncommercial use by request at http://naps.nencki.gov.pl. Electronic supplementary material: The online version of this article (doi:10.3758/s13428-013-0379-1) contains supplementary material, which is available to authorized users.
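
              The following Python sketch illustrates the kind of stimulus selection that such normative valence and arousal ratings make possible. It is a sketch under stated assumptions only: the database itself must be requested from its authors, and the CSV layout and the column names picture_id, category, valence, and arousal are invented here, not the actual file format.

                  # Minimal sketch, assuming a CSV export of NAPS-style normative ratings
                  # with hypothetical columns: picture_id, category, valence, arousal.
                  import csv

                  def select_stimuli(path, category, max_valence, min_arousal):
                      """Return IDs of pictures in a category with low valence and high arousal."""
                      selected = []
                      with open(path, newline="", encoding="utf-8") as f:
                          for row in csv.DictReader(f):
                              if (row["category"] == category
                                      and float(row["valence"]) <= max_valence
                                      and float(row["arousal"]) >= min_arousal):
                                  selected.append(row["picture_id"])
                      return selected

                  # Example: negative, arousing pictures of people (thresholds are arbitrary).
                  # print(select_stimuli("naps_ratings.csv", "people", max_valence=3.0, min_arousal=6.0))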

                Author and article information

                Journal: Virtual Reality
                Publisher: Springer Science and Business Media LLC
                ISSN: 1359-4338 (print); 1434-9957 (electronic)
                Published online: March 09 2021
                Issue date: December 2021
                Volume 25, Issue 4, pp. 1029-1041
                DOI: 10.1007/s10055-021-00506-5
                © 2021
                License: https://creativecommons.org/licenses/by/4.0
