
      Gated Orthogonal Recurrent Units: On Learning to Forget

      Preprint


          Abstract

          We present a novel recurrent neural network (RNN) architecture that combines the remembering ability of unitary RNNs with the ability of gated RNNs to effectively forget redundant information in the input sequence. We achieve this by extending unitary RNNs with a gating mechanism. Our model outperforms LSTMs, GRUs and unitary RNNs on different benchmark tasks, as the ability to simultaneously remember long-term dependencies and forget irrelevant information in the input sequence helps with many natural long-term sequential tasks such as language modeling and question answering. We provide competitive results along with an analysis of our model on the bAbI Question Answering task, Penn Treebank, as well as on synthetic tasks that involve long-term dependencies such as the parenthesis, denoising and copying tasks.
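
          The abstract's core idea, gating an orthogonal (unitary) recurrent update so the network can also forget, can be illustrated with a minimal PyTorch sketch. This is not the authors' released code: the GRU-style reset/update gates, the ReLU nonlinearity, and the orthogonal parametrization via a matrix exponential of a skew-symmetric parameter are simplifying assumptions made for the example.

          ```python
          # Minimal sketch of a gated orthogonal recurrent cell (illustrative only).
          import torch
          import torch.nn as nn

          class GatedOrthogonalCell(nn.Module):
              def __init__(self, input_size, hidden_size):
                  super().__init__()
                  # Skew-symmetric parameter; matrix_exp(A - A^T) is orthogonal,
                  # so the recurrent transform preserves the norm of the hidden state.
                  self.skew = nn.Parameter(torch.zeros(hidden_size, hidden_size))
                  self.in_proj = nn.Linear(input_size, hidden_size)
                  # Reset and update gates, GRU-style (an assumption for this sketch).
                  self.gates = nn.Linear(input_size + hidden_size, 2 * hidden_size)
                  self.bias = nn.Parameter(torch.zeros(hidden_size))

              def forward(self, x, h):
                  # Orthogonal recurrent matrix.
                  U = torch.matrix_exp(self.skew - self.skew.t())
                  r, z = torch.sigmoid(
                      self.gates(torch.cat([x, h], dim=-1))
                  ).chunk(2, dim=-1)
                  # Candidate state: orthogonal transform of the reset-gated history
                  # plus the projected input (ReLU here; the paper may differ).
                  h_tilde = torch.relu((r * h) @ U.t() + self.in_proj(x) + self.bias)
                  # Update gate interpolates between keeping and overwriting memory,
                  # i.e. it decides what to forget.
                  return z * h + (1.0 - z) * h_tilde

          # Usage: step a random sequence through the cell.
          cell = GatedOrthogonalCell(input_size=8, hidden_size=16)
          h = torch.zeros(1, 16)
          for x in torch.randn(5, 1, 8):
              h = cell(x, h)
          ```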


          Author and article information

          Journal
          2017-06-08
          Article
          1706.02761

          http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Custom metadata
          9 pages, 8 figures
          cs.LG cs.NE stat.ML

          Machine learning, Neural & Evolutionary computing, Artificial intelligence
