

Advances in natural language processing.

Julia Hirschberg 1 , Christopher D. Manning 2

Science (New York, N.Y.)


      Abstract

      Natural language processing employs computational techniques for the purpose of learning, understanding, and producing human language content. Early computational approaches to language research focused on automating the analysis of the linguistic structure of language and developing basic technologies such as machine translation, speech recognition, and speech synthesis. Today's researchers refine and make use of such tools in real-world applications, creating spoken dialogue systems and speech-to-speech translation engines, mining social media for information about health or finance, and identifying sentiment and emotion toward products and services. We describe successes and challenges in this rapidly advancing area.


      Most cited references (15)


      Gene ontology: tool for the unification of biology. The Gene Ontology Consortium.


        The Psychological Meaning of Words: LIWC and Computerized Text Analysis Methods


          Expectation-based syntactic comprehension.

           Roger Levy (2008)
          This paper investigates the role of resource allocation as a source of processing difficulty in human sentence comprehension. The paper proposes a simple information-theoretic characterization of processing difficulty as the work incurred by resource reallocation during parallel, incremental, probabilistic disambiguation in sentence comprehension, and demonstrates its equivalence to the theory of Hale [Hale, J. (2001). A probabilistic Earley parser as a psycholinguistic model. In Proceedings of NAACL (Vol. 2, pp. 159-166)], in which the difficulty of a word is proportional to its surprisal (its negative log-probability) in the context within which it appears. This proposal subsumes and clarifies findings that high-constraint contexts can facilitate lexical processing, and connects these findings to well-known models of parallel constraint-based comprehension. In addition, the theory leads to a number of specific predictions about the role of expectation in syntactic comprehension, including the reversal of locality-based difficulty patterns in syntactically constrained contexts, and conditions under which increased ambiguity facilitates processing. The paper examines a range of established results bearing on these predictions, and shows that they are largely consistent with the surprisal theory.
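The surprisal measure at the heart of Levy's theory is simple to state: the processing difficulty of a word is proportional to its negative log-probability in context. A minimal sketch of the arithmetic (the conditional probabilities below are invented for illustration, not drawn from any corpus):

```python
import math

def surprisal(probability: float) -> float:
    """Surprisal of a word: its negative log-probability (in bits)
    in the context where it appears, per Hale (2001) / Levy (2008)."""
    return -math.log2(probability)

# A highly predictable continuation carries little surprisal;
# an unexpected one carries much more.
p_expected = 0.5      # e.g. an assumed P("coffee" | "I drank a cup of")
p_unexpected = 0.001  # e.g. an assumed P("gravel" | "I drank a cup of")

print(surprisal(p_expected))    # 1.0 bit
print(surprisal(p_unexpected))  # ~9.97 bits
```

This captures why high-constraint contexts facilitate processing: the more sharply the context concentrates probability on the upcoming word, the lower that word's surprisal, and (on this theory) the lower its processing cost.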

            Author and article information

            Affiliations
             [1] Department of Computer Science, Columbia University, New York, NY 10027, USA. julia@cs.columbia.edu.
             [2] Department of Linguistics, Stanford University, Stanford, CA 94305-2150, USA. Department of Computer Science, Stanford University, Stanford, CA 94305-9020, USA.
             Journal: Science (New York, N.Y.)
             eISSN: 1095-9203
             ISSN: 0036-8075
             Published: Jul 17, 2015
             Volume: 349
             Issue: 6245
             Article locator: 349/6245/261
             DOI: 10.1126/science.aaa8685
             PMID: 26185244
            Copyright © 2015, American Association for the Advancement of Science.
