
      Sentence-Level Emotion Apprehension Through Facial Expression & Speech Verification Analysis

      Preprint, in review
      Three authors, all at affiliation [1]: Illustrator, Research Assistant, Research Assistant

            Abstract

            The importance of apprehending emotional states is widely recognized in social interaction and social intelligence, and the topic has been a popular research subject since the nineteenth century. In human-to-human communication, facial expressions form a communication channel that carries vital information about the mental, emotional, and even physical state of the people in conversation. A user's emotional state inevitably plays an important role not only in interactions with other people but also in the way the user works with a computer, since a person's emotional state can affect consistency, task-solving, and decision-making skills. Facial expression analysis, as used in this research, refers to computer systems that attempt to automatically predict a user's emotional state by analyzing and identifying facial motions and facial-feature changes in visual data. Although context, body gestures, voice, individual diversity, and cultural influences, as well as facial configuration and timing, all aid interpretation, the facial expression analysis tools used in this research analyze facial actions regardless of context, culture, gender, and so on.
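            The abstract names the overall approach but gives no implementation details; the keyword list (CNN, SoftMax, CMU-MOSEI) suggests a convolutional classifier over face images with a SoftMax output across emotion classes. As a rough illustration only, here is a minimal PyTorch sketch of such a frame-level classifier; the 48x48 grayscale input size, the layer widths, and the six emotion classes (the CMU-MOSEI emotion set) are assumptions for the sketch, not the authors' reported architecture.

            import torch
            import torch.nn as nn

            class EmotionCNN(nn.Module):
                """Minimal sketch: face crop -> SoftMax probabilities over emotions."""
                def __init__(self, num_emotions: int = 6):  # 6 = assumed CMU-MOSEI emotion set
                    super().__init__()
                    self.features = nn.Sequential(
                        nn.Conv2d(1, 32, kernel_size=3, padding=1),  # grayscale input (assumed)
                        nn.ReLU(),
                        nn.MaxPool2d(2),                             # 48x48 -> 24x24
                        nn.Conv2d(32, 64, kernel_size=3, padding=1),
                        nn.ReLU(),
                        nn.MaxPool2d(2),                             # 24x24 -> 12x12
                    )
                    self.classifier = nn.Linear(64 * 12 * 12, num_emotions)

                def forward(self, x: torch.Tensor) -> torch.Tensor:
                    x = self.features(x)
                    x = torch.flatten(x, 1)
                    return torch.softmax(self.classifier(x), dim=1)  # per-class probabilities

            model = EmotionCNN()
            probs = model(torch.randn(1, 1, 48, 48))  # one synthetic face crop
            print(probs.shape)  # torch.Size([1, 6])

            In practice such a model would usually be trained on raw logits with a cross-entropy loss, with the SoftMax applied only at inference; the explicit SoftMax above simply mirrors the keyword.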

            Content

            Author and article information

            Journal: ScienceOpen Preprints (ScienceOpen)
            Affiliations
            [1] Computer Science & Engineering, American International University-Bangladesh, 408/1, Kuratoli, Dhaka 1229
            Author information
            https://orcid.org/0000-0003-2753-446X
            https://orcid.org/0000-0002-5679-7317
            https://orcid.org/0000-0002-1374-1161
            Article DOI: 10.14293/S2199-1006.1.SOR-.PPMPPJP.v1

            This work has been published open access under the Creative Commons Attribution License CC BY 4.0, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. Conditions, terms of use, and publishing policy can be found at www.scienceopen.com.

            History
            Posted: 28 March 2022

            The data that support the findings of this study are available from http://multicomp.cs.cmu.edu/resources/cmu-mosei-dataset. Restrictions apply to the availability of these data, which were used under license for the current study, so they are not publicly available. The data are, however, available from the authors upon reasonable request and with the permission of http://multicomp.cs.cmu.edu/resources/cmu-mosei-dataset.
            Subject: Artificial intelligence
            Keywords: Sentiment Recognition, Image Processing, Speech Recognition, CNN, NLTK, NLP, Speech to Text, CMU-MOSEI, SoftMax, Emotion Intensity
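            The keyword list pairs Speech to Text with NLTK for the verbal channel, but this page gives no pipeline details. As a hedged sketch only, the snippet below scores a made-up transcript sentence by sentence with NLTK's built-in VADER analyzer; the authors' actual NLTK usage, and any fusion with the facial channel, are not described here.

            import nltk
            from nltk.sentiment import SentimentIntensityAnalyzer

            nltk.download("vader_lexicon", quiet=True)  # lexicon for VADER (one-time)
            nltk.download("punkt", quiet=True)          # models for sent_tokenize (one-time)

            # Hypothetical output of a speech-to-text stage; not from the paper.
            transcript = "I am really happy you came. The long delay was annoying though."

            sia = SentimentIntensityAnalyzer()
            for sentence in nltk.sent_tokenize(transcript):
                scores = sia.polarity_scores(sentence)  # neg/neu/pos plus compound in [-1, 1]
                print(f"{scores['compound']:+.2f}  {sentence}")

            A per-sentence compound score like this could serve as an emotion-intensity signal alongside the visual predictions, though how the two streams would be combined is not stated on this page.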

