      Distributional Semantics and Linguistic Theory

Gemma Boleda 1, 2
      Annual Review of Linguistics
      Annual Reviews

          Abstract

          Distributional semantics provides multidimensional, graded, empirically induced word representations that successfully capture many aspects of meaning in natural languages, as shown by a large body of research in computational linguistics; yet, its impact in theoretical linguistics has so far been limited. This review provides a critical discussion of the literature on distributional semantics, with an emphasis on methods and results that are relevant for theoretical linguistics, in three areas: semantic change, polysemy and composition, and the grammar–semantics interface (specifically, the interface of semantics with syntax and with derivational morphology). The goal of this review is to foster greater cross-fertilization of theoretical and computational approaches to language as a means to advance our collective knowledge of how it works.
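As a rough, self-contained sketch of the distributional idea summarized above (not code from the article), the snippet below induces word vectors from co-occurrence counts in a toy corpus and compares them with cosine similarity; the corpus, the window size, and the example words are illustrative assumptions.

```python
from collections import Counter, defaultdict
import math

# Toy corpus; real distributional models are induced from much larger corpora.
corpus = [
    "the cat chased the mouse",
    "the dog chased the cat",
    "the mouse ate the cheese",
    "the dog ate the bone",
]

window = 2  # a word's context = the words within 2 positions on either side
cooc = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                cooc[word][tokens[j]] += 1

def cosine(u, v):
    # Cosine similarity between two sparse count vectors (dicts of counts).
    dot = sum(u[t] * v.get(t, 0) for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Words that occur in similar contexts end up with similar vectors:
print(cosine(cooc["cat"], cooc["dog"]))     # higher
print(cosine(cooc["cat"], cooc["cheese"]))  # lower
```

In practice such count vectors are typically reweighted (e.g., with pointwise mutual information) and dimensionality-reduced, or replaced by neural word embeddings, but the resulting representations are multidimensional and graded in the same sense as above.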

Most cited references (52)

          From Frequency to Meaning: Vector Space Models of Semantics

          Computers understand very little of the meaning of human language. This profoundly limits our ability to give instructions to computers, the ability of computers to explain their actions to us, and the ability of computers to analyse and process text. Vector space models (VSMs) of semantics are beginning to address these limits. This paper surveys the use of VSMs for semantic processing of text. We organize the literature on VSMs according to the structure of the matrix in a VSM. There are currently three broad classes of VSMs, based on term-document, word-context, and pair-pattern matrices, yielding three classes of applications. We survey a broad range of applications in these three categories and we take a detailed look at a specific open source project in each category. Our goal in this survey is to show the breadth of applications of VSMs for semantics, to provide a new perspective on VSMs for those who are already familiar with the area, and to provide pointers into the literature for those who are less familiar with the field.
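As a concrete, deliberately tiny illustration of the term-document class of VSM described above (my own sketch, not the paper's code), each toy document below is a vector of term counts and document similarity is the cosine between those vectors.

```python
import math
from collections import Counter

# Toy documents; the term-document "matrix" is stored column-wise as
# {document name: Counter of term counts}.
docs = {
    "d1": "stocks rise as markets rally",
    "d2": "markets fall as stocks slide",
    "d3": "the striker scored a late goal",
}
term_doc = {name: Counter(text.split()) for name, text in docs.items()}

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0) for t in u)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# The two finance documents share vocabulary, so they score higher with each
# other than either does with the sports document:
print(cosine(term_doc["d1"], term_doc["d2"]))  # 0.6
print(cosine(term_doc["d1"], term_doc["d3"]))  # 0.0
```

The word-context class replaces documents with context words (as in the sketch after the main abstract above), and the pair-pattern class uses rows for word pairs and columns for the lexical patterns that connect them.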

            Composition in distributional models of semantics.

            Vector-based models of word meaning have become increasingly popular in cognitive science. The appeal of these models lies in their ability to represent meaning simply by using distributional information under the assumption that words occurring within similar contexts are semantically similar. Despite their widespread use, vector-based models are typically directed at representing words in isolation, and methods for constructing representations for phrases or sentences have received little attention in the literature. This is in marked contrast to experimental evidence (e.g., in sentential priming) suggesting that semantic similarity is more complex than simply a relation between isolated words. This article proposes a framework for representing the meaning of word combinations in vector space. Central to our approach is vector composition, which we operationalize in terms of additive and multiplicative functions. Under this framework, we introduce a wide range of composition models that we evaluate empirically on a phrase similarity task.
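The additive and multiplicative composition functions mentioned above can be sketched as follows; the 4-dimensional word vectors are invented for illustration (in practice they would be induced from corpus data), and the code assumes NumPy is available.

```python
import numpy as np

# Invented word vectors for illustration; real ones come from a corpus.
vec = {
    "horse": np.array([0.9, 0.1, 0.3, 0.0]),
    "race":  np.array([0.2, 0.8, 0.1, 0.4]),
    "runs":  np.array([0.1, 0.7, 0.2, 0.5]),
}

def additive(u, v):
    # p = u + v: every dimension of both words contributes to the phrase vector.
    return u + v

def multiplicative(u, v):
    # p = u * v (element-wise): only dimensions active in both words survive.
    return u * v

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

p_add = additive(vec["horse"], vec["race"])
p_mul = multiplicative(vec["horse"], vec["race"])

# Compare the composed phrase vector with another word's vector:
print(cosine(p_add, vec["runs"]))
print(cosine(p_mul, vec["runs"]))
```

Element-wise multiplication acts roughly like an intersection of the two words' features, while addition keeps contributions from both; which composition function best matches human judgments is the empirical question the phrase similarity evaluation addresses.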

              Explaining human performance in psycholinguistic tasks with models of semantic similarity based on prediction and counting: A review and empirical validation


Author and article information

Journal: Annual Review of Linguistics (Annu. Rev. Linguist.)
Publisher: Annual Reviews
ISSN: 2333-9683, 2333-9691
Published: January 14, 2020
Volume: 6
Issue: 1
Pages: 213-234

Affiliations
[1] Department of Translation and Language Sciences, Universitat Pompeu Fabra, Barcelona 08018, Spain
[2] Catalan Institution for Research and Advanced Studies (ICREA), Barcelona 08010, Spain

DOI: 10.1146/annurev-linguistics-011619-030303
© 2020
