Vector Space Models of Word Meaning and Phrase Meaning: A Survey

      Language and Linguistics Compass
      Wiley-Blackwell


Most cited references (26)


          Grounded cognition.

          Grounded cognition rejects traditional views that cognition is computation on amodal symbols in a modular system, independent of the brain's modal systems for perception, action, and introspection. Instead, grounded cognition proposes that modal simulations, bodily states, and situated action underlie cognition. Accumulating behavioral and neural evidence supporting this view is reviewed from research on perception, memory, knowledge, language, thought, social cognition, and development. Theories of grounded cognition are also reviewed, as are origins of the area and common misperceptions of it. Theoretical, empirical, and methodological issues are raised whose future treatment is likely to affect the growth and impact of grounded cognition.

            Composition in distributional models of semantics.

            Vector-based models of word meaning have become increasingly popular in cognitive science. The appeal of these models lies in their ability to represent meaning simply by using distributional information under the assumption that words occurring within similar contexts are semantically similar. Despite their widespread use, vector-based models are typically directed at representing words in isolation, and methods for constructing representations for phrases or sentences have received little attention in the literature. This is in marked contrast to experimental evidence (e.g., in sentential priming) suggesting that semantic similarity is more complex than simply a relation between isolated words. This article proposes a framework for representing the meaning of word combinations in vector space. Central to our approach is vector composition, which we operationalize in terms of additive and multiplicative functions. Under this framework, we introduce a wide range of composition models that we evaluate empirically on a phrase similarity task.
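The additive and multiplicative composition functions this abstract mentions are simple to state: given word vectors u and v, the phrase vector is u + v or the component-wise product u * v. The following Python/NumPy sketch illustrates both on toy vectors; the vectors, words, and similarity comparison are invented for illustration and are not the paper's actual data or evaluation.

import numpy as np

# Toy co-occurrence vectors, invented for illustration only;
# the paper derives such vectors from real corpus counts.
practical  = np.array([0.0, 6.0, 2.0, 10.0, 4.0])
difficulty = np.array([1.0, 8.0, 4.0, 4.0, 0.0])
problem    = np.array([2.0, 15.0, 7.0, 9.0, 1.0])

def additive(u, v):
    # Additive composition: p = u + v
    return u + v

def multiplicative(u, v):
    # Multiplicative composition: p = u * v (component-wise)
    return u * v

def cosine(a, b):
    # Cosine similarity between two vectors
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

for name, compose in [("additive", additive), ("multiplicative", multiplicative)]:
    phrase = compose(practical, difficulty)
    print(f"{name}: sim('practical difficulty', 'problem') = "
          f"{cosine(phrase, problem):.3f}")

Multiplicative composition acts as a kind of feature intersection, emphasizing context dimensions shared by both words, while addition blends the two vectors; the paper evaluates a range of such functions on a phrase similarity task.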

              Tensor product variable binding and the representation of symbolic structures in connectionist systems


                Author and article information

Journal: Language and Linguistics Compass
Publisher: Wiley-Blackwell
ISSN: 1749-818X
Published: October 2012 (online October 05 2012)
Volume: 6, Issue: 10, Pages: 635-653
DOI: 10.1002/lnco.362
© 2012
License: http://doi.wiley.com/10.1002/tdm_license_1.1
