
      Shortcut-Stacked Sentence Encoders for Multi-Domain Inference

Preprint


          Abstract

We present a simple sequential sentence encoder for multi-domain natural language inference. Our encoder is based on stacked bidirectional LSTM-RNNs with shortcut connections and fine-tuning of word embeddings. The overall supervised model uses the above encoder to encode two input sentences into two vectors, and then uses a classifier over the vector combination to label the relationship between these two sentences as entailment, contradiction, or neutral. Our shortcut-stacked sentence encoders achieve strong improvements over existing encoders on matched and mismatched multi-domain natural language inference (top non-ensemble single-model result in the EMNLP RepEval 2017 Shared Task (Nangia et al., 2017)). Moreover, they achieve the new state-of-the-art encoding result on the original SNLI dataset (Bowman et al., 2015).
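The wiring the abstract describes — shortcut connections that feed each stacked layer the word embeddings plus all earlier layers' outputs, pooling to a fixed sentence vector, and a classifier over a combination of the two sentence vectors — can be sketched as follows. This is a minimal NumPy sketch, not the authors' implementation: random linear maps stand in for the BiLSTM layers, and the layer widths, max pooling, and the [v1; v2; |v1 − v2|; v1 ∗ v2] feature combination are assumptions beyond what the abstract itself specifies.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(words, layer_dims, rng):
    """Shortcut-stacked encoder sketch: layer k reads the word
    embeddings concatenated with the outputs of layers 1..k-1.
    A random linear map + tanh stands in for each BiLSTM layer
    (assumption: the real model uses bidirectional LSTMs here)."""
    outputs = []  # per-layer hidden-state sequences
    for dim in layer_dims:
        # shortcut connections: raw embeddings + all earlier outputs
        layer_in = np.concatenate([words] + outputs, axis=1)
        W = rng.standard_normal((layer_in.shape[1], dim)) * 0.1
        outputs.append(np.tanh(layer_in @ W))  # stand-in for a BiLSTM
    # max pooling over time positions gives the sentence vector
    return outputs[-1].max(axis=0)

def pair_features(v1, v2):
    # combination fed to the 3-way (entailment/contradiction/neutral)
    # classifier: concatenation, absolute difference, elementwise product
    return np.concatenate([v1, v2, np.abs(v1 - v2), v1 * v2])

premise = rng.standard_normal((7, 300))     # 7 words, 300-d embeddings
hypothesis = rng.standard_normal((5, 300))  # 5 words
dims = [512, 1024, 2048]                    # hypothetical layer widths
v1 = encode(premise, dims, rng)
v2 = encode(hypothesis, dims, rng)
m = pair_features(v1, v2)
print(m.shape)  # 4 * 2048 features would feed a softmax over 3 labels
```

Note the design point the shortcuts buy: because every layer still sees the original embeddings, deeper layers can refine rather than re-derive the lexical signal, which is what lets the stack grow without the usual degradation.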


          Author and article information

Date: 07 August 2017
arXiv: 1708.02312
Record ID: 3003dd8c-3169-4f52-a110-7a59e1877d15
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
Comments: EMNLP 2017 RepEval Multi-NLI Shared Task (5 pages)
Subjects: cs.CL cs.AI cs.LG
