
      Declarative Experimentation in Information Retrieval using PyTerrier

      Preprint


          Abstract

          The advent of deep machine learning platforms such as TensorFlow and PyTorch, developed in expressive high-level languages such as Python, has allowed more expressive representations of deep neural network architectures. We argue that such a powerful formalism is missing in information retrieval (IR), and propose a framework called PyTerrier that allows advanced retrieval pipelines to be expressed, and evaluated, in a declarative manner close to their conceptual design. Like the aforementioned frameworks that compile deep learning experiments into primitive GPU operations, our framework targets IR platforms as backends in order to execute and evaluate retrieval pipelines. Further, we can automatically optimise the retrieval pipelines to increase their efficiency to suit a particular IR platform backend. Our experiments, conducted on the TREC Robust and ClueWeb09 test collections, demonstrate the efficiency benefits of these optimisations for retrieval pipelines involving both the Anserini and Terrier IR platforms.
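          The declarative idea in the abstract — retrieval stages composed with operators into a pipeline that a backend then executes — can be sketched in a few lines of plain Python. PyTerrier composes transformers with the `>>` ("then") operator; the `Transformer` class below is a hypothetical stand-in for illustration, not PyTerrier's actual API.

```python
# Minimal sketch of declarative pipeline composition via operator
# overloading, in the spirit of PyTerrier's ">>" (then) operator.
# The class and stage names here are illustrative stand-ins.

class Transformer:
    """A pipeline stage mapping a list of (docid, score) results to a new list."""

    def __init__(self, fn, name):
        self.fn = fn
        self.name = name

    def transform(self, results):
        return self.fn(results)

    def __rshift__(self, other):
        # "a >> b" composes two stages into a single pipeline stage;
        # a real framework could inspect this expression tree to optimise it.
        return Transformer(
            lambda r: other.transform(self.transform(r)),
            f"{self.name} >> {other.name}",
        )


# Two toy stages: a first-pass ranker that sorts by score, and a
# re-ranker that rescales scores.
bm25 = Transformer(lambda r: sorted(r, key=lambda d: -d[1]), "BM25")
rerank = Transformer(lambda r: [(d, s * 2.0) for d, s in r], "Rerank")

pipeline = bm25 >> rerank
print(pipeline.name)  # BM25 >> Rerank
print(pipeline.transform([("d1", 0.2), ("d2", 0.9)]))
```

          Because the composed pipeline is an object rather than an opaque function call, a framework can rewrite it (e.g. fusing or reordering stages) before dispatching to an IR platform backend — the optimisation the abstract describes.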


          Author and article information

          Date: 28 July 2020
          Type: Article (Preprint)
          DOI: 10.1145/3409256.3409829
          arXiv: 2007.14271
          Record ID: e13b8dfd-1aab-4fd5-b8dd-40f843476fa5
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
          Venue: 2020 ACM SIGIR International Conference on the Theory of Information Retrieval (ICTIR '20)
          Subject: cs.IR

          Information & Library science
