
      Neural Network Methods for Natural Language Processing

      Synthesis Lectures on Human Language Technologies
      Morgan & Claypool Publishers LLC


Most cited references (63)

          Multilayer feedforward networks are universal approximators

            LSTM: A Search Space Odyssey

Several variants of the long short-term memory (LSTM) architecture for recurrent neural networks have been proposed since its inception in 1995. In recent years, these networks have become the state-of-the-art models for a variety of machine learning problems. This has led to a renewed interest in understanding the role and utility of various computational components of typical LSTM variants. In this paper, we present the first large-scale analysis of eight LSTM variants on three representative tasks: speech recognition, handwriting recognition, and polyphonic music modeling. The hyperparameters of all LSTM variants for each task were optimized separately using random search, and their importance was assessed using the powerful functional ANalysis Of VAriance framework. In total, we summarize the results of 5400 experimental runs (≈ 15 years of CPU time), which makes our study the largest of its kind on LSTM networks. Our results show that none of the variants can improve upon the standard LSTM architecture significantly, and demonstrate the forget gate and the output activation function to be its most critical components. We further observe that the studied hyperparameters are virtually independent and derive guidelines for their efficient adjustment.
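The standard LSTM step that this study treats as its baseline can be summarized in a few lines. Below is a minimal NumPy sketch (the gate names, the dict-of-weights layout, and the dimensions are illustrative assumptions, not the paper's notation); it highlights the forget gate and the output tanh, the two components the abstract singles out as most critical:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x_t, h_prev, c_prev, W, U, b):
        # W, U, b hold input weights, recurrent weights, and biases
        # for the four gates, keyed 'i', 'f', 'o', 'g' (illustrative layout).
        i = sigmoid(W['i'] @ x_t + U['i'] @ h_prev + b['i'])  # input gate
        f = sigmoid(W['f'] @ x_t + U['f'] @ h_prev + b['f'])  # forget gate
        o = sigmoid(W['o'] @ x_t + U['o'] @ h_prev + b['o'])  # output gate
        g = np.tanh(W['g'] @ x_t + U['g'] @ h_prev + b['g'])  # candidate cell
        c_t = f * c_prev + i * g   # forget gate scales the old cell state
        h_t = o * np.tanh(c_t)     # tanh here is the "output activation"
        return h_t, c_t

    # Tiny smoke test with random weights (input dim 4, hidden dim 3).
    rng = np.random.default_rng(0)
    n_in, n_h = 4, 3
    W = {k: rng.normal(size=(n_h, n_in)) for k in 'ifog'}
    U = {k: rng.normal(size=(n_h, n_h)) for k in 'ifog'}
    b = {k: np.zeros(n_h) for k in 'ifog'}
    h_t, c_t = lstm_step(rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h), W, U, b)

The variants examined in the paper modify or remove parts of this step (for example, dropping the forget gate or the output activation); the finding quoted above is that none of those changes significantly improves on this baseline.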

              Backpropagation through time: what it does and how to do it

                Author and article information

Journal: Synthesis Lectures on Human Language Technologies
Publisher: Morgan & Claypool Publishers LLC
ISSN: 1947-4040, 1947-4059
Publication date: April 17, 2017
Volume 10, Issue 1, pages 1-309
DOI: 10.2200/S00762ED1V01Y201703HLT037
© 2017
