
      A Novel Cascade Model for End-to-End Aspect-Based Social Comment Sentiment Analysis

      Electronics
      MDPI AG


          Abstract

The end-to-end aspect-based social comment sentiment analysis (E2E-ABSA) task aims to discover fine-grained sentiment polarity, i.e., the attitude toward a specific object revealed in a social user's textual description. The E2E-ABSA problem comprises two sub-tasks: opinion target extraction and target sentiment identification. However, most previous methods model these two sub-tasks independently, which inevitably limits overall practical performance. This paper investigates the critical collaborative signals between the two sub-tasks and proposes a novel cascade social comment sentiment analysis model, named CasNSA, for jointly tackling the E2E-ABSA problem. Instead of treating opinion target extraction and target sentiment identification as separate procedures, as in previous works, the new framework takes the contextualized target semantic encoding into consideration to yield better sentiment polarity judgments. Extensive empirical results show that the proposed approach achieves a 68.13% F1-score on SemEval-2014, 62.34% on SemEval-2015, 56.40% on SemEval-2016, and 50.05% on a Twitter dataset, which is higher than existing approaches. Ablation experiments demonstrate that the CasNSA model substantially outperforms state-of-the-art methods even when using fixed word embeddings rather than fine-tuned pre-trained BERT. Moreover, an in-depth performance analysis on the social comment datasets further validates that the model delivers superior performance and reliability in realistic scenarios.
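
As a rough illustration of the cascade idea described in the abstract (a sketch only, not the authors' CasNSA implementation: the stand-in encoder, layer sizes, and BIO/polarity tag sets below are assumptions), one shared contextual encoder can feed both a target-extraction tagger and a polarity head that pools the contextualized encoding of each extracted target span, here in Python/PyTorch:

    # Illustrative sketch only: a stand-in encoder, toy layer sizes, and a
    # hypothetical BIO tag set; not the CasNSA implementation from the paper.
    import torch
    import torch.nn as nn

    class CascadeABSASketch(nn.Module):
        def __init__(self, vocab_size=30522, hidden=256, num_bio=3, num_polarity=3):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, hidden)      # stand-in for a BERT encoder
            self.encoder = nn.LSTM(hidden, hidden // 2, batch_first=True, bidirectional=True)
            self.bio_head = nn.Linear(hidden, num_bio)          # stage 1: O / B-ASP / I-ASP tags
            self.pol_head = nn.Linear(hidden, num_polarity)     # stage 2: POS / NEU / NEG

        def forward(self, token_ids):
            ctx, _ = self.encoder(self.embed(token_ids))        # shared contextualized encoding
            bio_logits = self.bio_head(ctx)                     # extract opinion targets
            spans = self.decode_spans(bio_logits.argmax(-1))    # cascade: spans feed stage 2
            pol_logits = [self.pol_head(ctx[b, s:e].mean(dim=0)) for b, s, e in spans]
            return bio_logits, spans, pol_logits

        @staticmethod
        def decode_spans(tags):
            # Greedy BIO decoding; tag 1 = B-ASP, tag 2 = I-ASP.
            spans = []
            for b, seq in enumerate(tags.tolist()):
                i = 0
                while i < len(seq):
                    if seq[i] == 1:
                        j = i + 1
                        while j < len(seq) and seq[j] == 2:
                            j += 1
                        spans.append((b, i, j))
                        i = j
                    else:
                        i += 1
            return spans

    model = CascadeABSASketch()
    bio_logits, spans, pol_logits = model(torch.randint(0, 30522, (2, 12)))

On a toy batch, the call returns token-level BIO logits plus one polarity logit vector per decoded target span, mirroring the two-stage coupling (extraction conditioning sentiment identification) that the abstract describes.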


Most cited references (55)


GloVe: Global Vectors for Word Representation

Jeffrey Pennington, Richard Socher, Christopher Manning (2014)


            Sentiment Analysis and Opinion Mining

            Bing Liu (2012)

Distributed Representations of Words and Phrases and their Compositionality

Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean (2013)

              The recently introduced continuous Skip-gram model is an efficient method for learning high-quality distributed vector representations that capture a large number of precise syntactic and semantic word relationships. In this paper we present several extensions that improve both the quality of the vectors and the training speed. By subsampling of the frequent words we obtain significant speedup and also learn more regular word representations. We also describe a simple alternative to the hierarchical softmax called negative sampling. An inherent limitation of word representations is their indifference to word order and their inability to represent idiomatic phrases. For example, the meanings of "Canada" and "Air" cannot be easily combined to obtain "Air Canada". Motivated by this example, we present a simple method for finding phrases in text, and show that learning good vector representations for millions of phrases is possible.
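
For concreteness, here is a minimal sketch of the techniques summarized above (assuming the gensim 4.x API; the toy corpus and parameter values are illustrative only, not from the cited work):

    # Minimal illustration (assumed gensim 4.x API); the toy corpus and
    # parameter values are placeholders.
    from gensim.models import Word2Vec
    from gensim.models.phrases import Phrases, Phraser

    corpus = [
        ["air", "canada", "delayed", "the", "flight"],
        ["air", "canada", "cancelled", "the", "flight"],
    ]

    # Phrase detection: frequent collocations become single tokens ("air_canada").
    bigram = Phraser(Phrases(corpus, min_count=1, threshold=1.0))
    phrased = [bigram[sentence] for sentence in corpus]

    # Skip-gram (sg=1) with 5 negative samples and frequent-word subsampling.
    model = Word2Vec(phrased, vector_size=50, window=2, sg=1,
                     negative=5, sample=1e-5, min_count=1, epochs=50)

    print("air_canada" in model.wv)   # True when the collocation was merged

Here sg=1 selects the skip-gram objective, negative=5 uses negative sampling in place of the hierarchical softmax, sample=1e-5 subsamples frequent words, and the Phrases pass merges collocations such as "air canada" into a single token, matching the extensions described in the abstract.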

                Author and article information

Journal: Electronics (ELECGJ), MDPI AG
ISSN: 2079-9292
Published: 07 June 2022 (June 2022 issue)
Volume: 11
Issue: 12
Article number: 1810
DOI: 10.3390/electronics11121810
Record ID: e9e7ce2c-1e7b-4bd0-a530-9e1c97e81c42
Copyright: © 2022
License: CC BY 4.0 (https://creativecommons.org/licenses/by/4.0/), open access

