
      QoE of cross-modally mapped Mulsemedia: an assessment using eye gaze and heart rate


          Abstract

          A great deal of research effort in cognitive science has gone into exploring crossmodal correspondences, the systematic associations frequently made between different sensory modalities (e.g. high pitch is matched with angular shapes). However, the possibilities cross-modality opens up in the digital world remain relatively unexplored. We therefore consider that studying the plasticity and the effects of crossmodal correspondences in a mulsemedia setup can bring novel insights into improving the human-computer dialogue and experience. Mulsemedia refers to the combination of three or more senses to create immersive experiences. In our experiments, users were shown six video clips associated with certain visual features based on color, brightness, and shape. We examined whether pairing these clips with a crossmodally matching sound, the corresponding auto-generated haptic effect, and smell would lead to an enhanced user QoE. For this, we used an eye-tracking device as well as a heart rate monitor wristband to capture users’ eye gaze and heart rate whilst they were experiencing mulsemedia. After each video clip, we asked the users to complete an on-screen questionnaire with a set of questions related to smell, sound and haptic effects, targeting their enjoyment and perception of the experiment. The eye gaze and heart rate results showed a significant influence of the cross-modally mapped multisensorial effects on the users’ QoE. Our results highlight that when the olfactory content is crossmodally congruent with the visual content, the visual attention of the users appears to shift towards the corresponding visual feature. Crossmodally matched media is also shown to result in an enhanced QoE compared to a video-only condition.

          Related collections

          Most cited references (61)


          DEAP: A Database for Emotion Analysis Using Physiological Signals

          IEEE Transactions on Affective Computing, 3(1), 18-31

            On cross-modal similarity: auditory-visual interactions in speeded discrimination.

            A series of four experiments explored how cross-modal similarities between sensory attributes in vision and hearing reveal themselves in speeded, two-stimulus discrimination. When subjects responded differentially to stimuli in one modality, speed and accuracy of response were greater on trials accompanied by informationally irrelevant "matching" versus "mismatching" stimuli from the other modality. Cross-modal interactions appeared in (a) responses to dim/bright lights and to dark/light colors accompanied by low-pitched/high-pitched tones; (b) responses to low-pitched/high-pitched tones accompanied by dim/bright lights or by dark/light colors; (c) responses to dim/bright lights, but not to dark/light colors, accompanied by soft/loud sounds; and (d) responses to rounded/sharp forms accompanied by low-pitched/high-pitched tones. These results concur with findings on cross-modal perception, synesthesia, and synesthetic metaphor, which reveal similarities between pitch and brightness, pitch and lightness, loudness and brightness, and pitch and form. The cross-modal interactions in response speed and accuracy may take place at a sensory/perceptual level of processing or after sensory stimuli are encoded semantically.

              Why we are not all synesthetes (not even weakly so).

              A little over a decade ago, Martino and Marks (Current Directions in Psychological Science 10:61-65, 2001) put forward the influential claim that cases of intuitive matchings between stimuli in different sensory modalities should be considered as a weak form of synesthesia. Over the intervening years, many other researchers have agreed, at the very least implicitly, with this position (e.g., Bien, ten Oever, Goebel, & Sack NeuroImage 59:663-672, 2012; Eagleman Cortex 45:1266-1277, 2009; Esterman, Verstynen, Ivry, & Robertson Journal of Cognitive Neuroscience 18:1570-1576, 2006; Ludwig, Adachi, & Matsuzawa Proceedings of the National Academy of Sciences of the United States of America 108:20661-20665, 2011; Mulvenna & Walsh Trends in Cognitive Sciences 10:350-352, 2006; Sagiv & Ward 2006; Zellner, McGarry, Mattern-McClory, & Abreu Chemical Senses 33:211-222, 2008). Here, though, we defend the separatist view, arguing that these cases are likely to form distinct kinds of phenomena despite their superficial similarities. We believe that crossmodal correspondences should be studied in their own right and not assimilated, either in terms of the name used or in terms of the explanation given, to synesthesia. To conflate these two phenomena is both inappropriate and potentially misleading. Below, we critically evaluate the evidence concerning the descriptive and constitutive features of crossmodal correspondences and synesthesia and highlight how they differ. Ultimately, we wish to provide a general definition of crossmodal correspondences as acquired, malleable, relative, and transitive pairings between sensory dimensions, and to provide a framework in which to integrate the nonsystematic cataloguing of new cases of crossmodal correspondences, a tendency that has increased in recent years.

                Author and article information

                Contributors
                Journal: Multimedia Tools and Applications (Multimed Tools Appl)
                Publisher: Springer Science and Business Media LLC
                ISSN: 1380-7501 (print); 1573-7721 (electronic)
                Published: March 2020 (online: January 03 2020)
                Volume 79, Issue 11-12, pages 7987-8009
                DOI: 10.1007/s11042-019-08473-5
                © 2020
