      Is Open Access

      Learning text representation using recurrent convolutional neural network with highway layers

      Preprint


          Abstract

          Recently, the rapid development of word embeddings and neural networks has brought new inspiration to various NLP and IR tasks. In this paper, we describe a staged hybrid model combining Recurrent Convolutional Neural Networks (RCNN) with highway layers. The highway network module, incorporated in the middle stage, takes the output of the bidirectional Recurrent Neural Network (Bi-RNN) module in the first stage and provides the input to the Convolutional Neural Network (CNN) module in the last stage. Experiments show that our model outperforms common neural network models (CNN, RNN, Bi-RNN) on a sentiment analysis task. In addition, an analysis of how sequence length influences the RCNN with highway layers shows that our model can learn good representations for long text.
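          The highway module referenced above gates between a nonlinear transform of its input and the identity, so information from the Bi-RNN stage can pass through unchanged when the gate is closed. A minimal NumPy sketch of one highway layer follows; the function and parameter names are illustrative, not taken from the paper:

```python
import numpy as np

def highway(x, W_h, b_h, W_t, b_t):
    """One highway layer: y = H(x) * T(x) + x * (1 - T(x)).

    x: (batch, d) input; all weight matrices are (d, d), since a
    highway layer requires matching input/output dimensions.
    """
    # Transform gate T(x) = sigmoid(x W_t + b_t)
    t = 1.0 / (1.0 + np.exp(-(x @ W_t + b_t)))
    # Candidate transform H(x) = relu(x W_h + b_h)
    h = np.maximum(0.0, x @ W_h + b_h)
    # Gated mix: when t ~ 0 the layer carries x through unchanged
    return h * t + x * (1.0 - t)
```

Initializing `b_t` to a negative value biases the gate toward the carry behavior early in training, which is the usual trick for making deep highway stacks trainable.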


          Author and article information

          Journal: arXiv (preprint)
          Dates: 2016-06-22, 2016-08-02
          Type: Article
          arXiv ID: 1606.06905

          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Venue: Neu-IR '16 SIGIR Workshop on Neural Information Retrieval
          Subject classes: cs.CL, cs.IR

          Theoretical computer science, Information & Library science
