
      Automatic Assessment of Emotion Dysregulation in American, French, and Tunisian Adults and New Developments in Deep Multimodal Fusion: Cross-sectional Study

      research-article
      Federico Parra, PhD 1; Yannick Benezeth, PhD 1; Fan Yang, PhD 1
      JMIR Mental Health
      JMIR Publications
      emotion dysregulation, deep multimodal fusion, small data, psychometrics


          Abstract

          Background

          Emotion dysregulation is a key dimension of adult psychological functioning, and there is interest in developing a computer-based, multimodal, automatic measure of it.

          Objective

          We aimed to train a deep multimodal fusion model to estimate emotion dysregulation in adults from their responses to the Multimodal Developmental Profile, a computer-based psychometric test, using only a small training sample and without transfer learning.

          Methods

          A total of 248 participants from 3 countries took the Multimodal Developmental Profile test, which presented 14 picture and music stimuli and asked participants to express their feelings about each one. The software extracted the following features from the video and audio signals: facial expressions, linguistic and paralinguistic characteristics of speech, head movements, gaze direction, and heart rate variability derivatives. Participants also completed the brief version of the Difficulties in Emotion Regulation Scale.

          We separated and averaged the feature signals corresponding to the response to each stimulus, producing a structured data set. We then transformed each person's per-stimulus structured data into a multimodal codex: a grayscale image created by projecting each feature's normalized intensity value onto a Cartesian plane, with each pixel's position derived by the Uniform Manifold Approximation and Projection (UMAP) method.

          The codex sequence was then fed to 2 network types. First, 13 convolutional neural networks handled the spatial aspect of the problem, estimating emotion dysregulation from each codified response. These convolutional estimations were then fed to a transformer network that decoded the temporal aspect of the problem, estimating emotion dysregulation from the succession of responses. We introduce a Feature Map Average Pooling layer, which computes the mean of the convolved feature maps produced by our convolution layers, dramatically reducing the number of learnable weights and increasing regularization through an ensembling effect.

          We implemented 8-fold cross-validation to obtain a reasonable estimate of generalization to unseen samples. Most of the experiments mentioned in this paper are easily replicable using the associated Google Colab system.
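The Feature Map Average Pooling layer described above collapses a stack of convolved feature maps into their per-pixel mean. A minimal NumPy sketch of that operation follows; it illustrates the idea only, is not the authors' implementation, and the array shapes are assumptions:

```python
import numpy as np

def feature_map_average_pooling(feature_maps: np.ndarray) -> np.ndarray:
    """Collapse a stack of convolved feature maps of shape (C, H, W) into a
    single (H, W) map by averaging across the channel axis. Downstream layers
    then see 1 map instead of C, which sharply reduces the number of learnable
    weights, and the mean acts as a lightweight ensemble of the C filters."""
    return feature_maps.mean(axis=0)

# Illustrative shapes: 32 feature maps of size 8x8
maps = np.random.rand(32, 8, 8)
pooled = feature_map_average_pooling(maps)
print(pooled.shape)  # (8, 8)
```

A learnable head attached after this layer operates on H x W values rather than C x H x W, which is the source of the parameter reduction the abstract refers to.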

          Results

          We found an average Pearson correlation (r) of 0.55 (average P value <.001) between ground truth emotion dysregulation and our system's estimates. The average mean absolute error was 0.16, and the mean concordance correlation coefficient was 0.54.
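For reference, the three reported metrics follow their standard definitions. Below is a generic NumPy sketch of those definitions (not the authors' evaluation code); the concordance correlation coefficient is Lin's CCC:

```python
import numpy as np

def pearson_r(y_true, y_pred):
    """Pearson correlation: linear association between targets and estimates."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

def mean_absolute_error(y_true, y_pred):
    """Mean absolute error: average magnitude of the estimation errors."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

def concordance_ccc(y_true, y_pred):
    """Lin's concordance correlation coefficient: unlike Pearson r, it also
    penalizes systematic shifts in mean or scale between the two series."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    mx, my = y_true.mean(), y_pred.mean()
    vx, vy = y_true.var(), y_pred.var()
    cov = np.mean((y_true - mx) * (y_pred - my))
    return float(2 * cov / (vx + vy + (mx - my) ** 2))
```

CCC equals 1 only for perfect agreement, so it is a stricter criterion than Pearson r; the near-identical values reported (0.55 vs 0.54) indicate the estimates were well calibrated, not merely correlated.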

          Conclusions

          In psychometric terms, our results provide strong evidence of convergent validity, suggesting that the Multimodal Developmental Profile, used in conjunction with this methodology, could provide a valid measure of emotion dysregulation in adults. Future studies should replicate our findings using a held-out test sample. More generally, our methodology could be used to train deep neural networks where only small training samples are available.


                Author and article information

                Contributors
                Journal
                JMIR Mental Health (JMIR Ment Health; JMH)
                JMIR Publications (Toronto, Canada)
                ISSN: 2368-7959
                24 January 2022; 9(1):e34333
                Affiliations
                [1] LE2I EA 7508, Université Bourgogne Franche-Comté, Dijon, France
                Author notes
                Corresponding Author: Federico Parra, federico.parra@hotmail.com
                Author information
                https://orcid.org/0000-0002-5333-8211
                https://orcid.org/0000-0002-6034-7431
                https://orcid.org/0000-0002-6844-2521
                Article
                v9i1e34333
                DOI: 10.2196/34333
                PMCID: 8822434
                PMID: 35072643
                ©Federico Parra, Yannick Benezeth, Fan Yang. Originally published in JMIR Mental Health (https://mental.jmir.org), 24.01.2022.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Mental Health, is properly cited. The complete bibliographic information, a link to the original publication on https://mental.jmir.org/, as well as this copyright and license information must be included.

                History
                19 October 2021; 8 November 2021; 10 November 2021; 23 November 2021
                Categories
                Original Paper

