
      Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies

      Transactions of the Association for Computational Linguistics
      MIT Press - Journals



Most cited references (9)


          Learning and development in neural networks: the importance of starting small.

It is a striking fact that in humans the greatest learning occurs precisely at that point in time (childhood) when the most dramatic maturational changes also occur. This report describes possible synergistic interactions between maturational change and the ability to learn a complex domain (language), as investigated in connectionist networks. The networks are trained to process complex sentences involving relative clauses, number agreement, and several types of verb argument structure. Training fails in the case of networks which are fully formed and 'adultlike' in their capacity. Training succeeds only when networks begin with limited working memory and gradually 'mature' to the adult state. This result suggests that rather than being a limitation, developmental restrictions on resources may constitute a necessary prerequisite for mastering certain complex domains. Specifically, successful learning may depend on starting small.
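The "starting small" regime described above can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual implementation: the working-memory limit is modeled as a truncation window over the input tokens, and the schedule values (`start`, `step`, the number of stages) are illustrative assumptions.

```python
# Hypothetical sketch of Elman's "starting small" training regime:
# early training stages see only a short window of each sentence
# (limited working memory); later stages see progressively longer
# windows, simulating maturation toward the adult state.

def memory_schedule(n_stages, start=3, step=2):
    """Working-memory limit (in tokens) for each training stage."""
    return [start + step * stage for stage in range(n_stages)]

def truncate_to_memory(sentence_tokens, limit):
    """Simulate limited working memory by keeping only the most
    recent `limit` tokens of the input."""
    return sentence_tokens[-limit:]

def starting_small_batches(corpus, n_stages=4):
    """Yield (stage, truncated_sentence) pairs: the same corpus is
    re-presented at each stage under a growing memory limit."""
    for stage, limit in enumerate(memory_schedule(n_stages)):
        for sentence in corpus:
            yield stage, truncate_to_memory(sentence, limit)

corpus = [["the", "boys", "who", "the", "girl", "likes", "are", "tall"]]
for stage, window in starting_small_batches(corpus):
    print(stage, window)
```

In a real experiment the truncated windows would feed a recurrent network's training step at each stage; the point of the sketch is only the curriculum structure, in which capacity grows as training proceeds.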

            Structures, Not Strings: Linguistics as Part of the Cognitive Sciences.

There are many questions one can ask about human language: its distinctive properties, neural representation, characteristic uses including use in communicative contexts, variation, growth in the individual, and origin. Every such inquiry is guided by some concept of what 'language' is. Sharpening the core question (what is language?) and paying close attention to the basic property of the language faculty and its biological foundations makes it clear how linguistics is firmly positioned within the cognitive sciences. Here we will show how recent developments in generative grammar, taking language as a computational cognitive mechanism seriously, allow us to address issues left unexplained in the increasingly popular surface-oriented approaches to language.

              Simple and Accurate Dependency Parsing Using Bidirectional LSTM Feature Representations


                Author and article information

Journal
Transactions of the Association for Computational Linguistics
MIT Press - Journals
ISSN: 2307-387X
December 2016
Volume: 4
Pages: 521-535
Affiliations
[1] LSCP & IJN, CNRS, EHESS and ENS, PSL Research University
[2] LSCP, CNRS, EHESS and ENS, PSL Research University
[3] Computer Science Department, Bar Ilan University
Article
DOI: 10.1162/tacl_a_00115
© 2016
