
      Learning feed-forward one-shot learners

      Preprint


          Abstract

          One-shot learning is usually tackled by using generative models or discriminative embeddings. Discriminative methods based on deep learning, which are very effective in other learning scenarios, are ill-suited for one-shot learning as they need large amounts of training data. In this paper, we propose a method to learn the parameters of a deep model in one shot. We construct the learner as a second deep network, called a learnet, which predicts the parameters of a pupil network from a single exemplar. In this manner we obtain an efficient feed-forward one-shot learner, trained end-to-end by minimizing a one-shot classification objective in a learning to learn formulation. In order to make the construction feasible, we propose a number of factorizations of the parameters of the pupil network. We demonstrate encouraging results by learning characters from single exemplars in Omniglot, and by tracking visual objects from a single initial exemplar in the Visual Object Tracking benchmark.
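          The following is a minimal sketch of the learnet idea described in the abstract, written in PyTorch (a framework choice assumed here, not stated in the paper). A learnet maps a single exemplar to a small parameter vector, which is used as the diagonal of a factorized linear layer inside a pupil network applied to query images, echoing the abstract's factorization idea. All layer sizes, module names, and the siamese-style scoring at the end are illustrative assumptions, not the authors' exact architecture or objective.

          import torch
          import torch.nn as nn

          class Learnet(nn.Module):
              """Predicts a parameter vector w(z) from a single exemplar z (sketch)."""
              def __init__(self, feat_dim: int = 256, pred_dim: int = 128):
                  super().__init__()
                  self.encoder = nn.Sequential(
                      nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
                      nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                      nn.Linear(64, feat_dim), nn.ReLU(),
                      nn.Linear(feat_dim, pred_dim),
                  )

              def forward(self, exemplar: torch.Tensor) -> torch.Tensor:
                  return self.encoder(exemplar)  # shape: (batch, pred_dim)

          class Pupil(nn.Module):
              """Pupil network with a factorized layer W(z) = M' diag(w(z)) M (sketch)."""
              def __init__(self, in_dim: int = 64, pred_dim: int = 128, out_dim: int = 64):
                  super().__init__()
                  # Shared feature extractor for query images.
                  self.features = nn.Sequential(
                      nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
                      nn.Conv2d(32, in_dim, 3, stride=2, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                  )
                  # Static factors M and M'; only the diagonal between them is
                  # predicted per exemplar, keeping the learnet's output small.
                  self.M = nn.Linear(in_dim, pred_dim, bias=False)
                  self.M_prime = nn.Linear(pred_dim, out_dim, bias=False)

              def forward(self, query: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
                  h = self.M(self.features(query))   # project into the factorized space
                  h = h * w                          # apply diag(w(z)) elementwise
                  return self.M_prime(h)             # exemplar-conditioned embedding

          # Illustrative usage: embed query and exemplar with the exemplar-conditioned
          # pupil and compare them; a one-shot objective would contrast positives
          # against negatives end-to-end through both networks.
          learnet, pupil = Learnet(), Pupil()
          exemplar = torch.randn(4, 1, 28, 28)   # e.g. Omniglot-sized characters
          query = torch.randn(4, 1, 28, 28)
          w = learnet(exemplar)
          score = torch.cosine_similarity(pupil(query, w), pupil(exemplar, w), dim=1)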


          Author and article information

          Date: 2016-06-16
          Article type: Preprint
          arXiv ID: 1606.05233
          Record ID: 8802c2b8-bb51-4ba1-b7a8-0a43a82c39b9

          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Notes: The first three authors contributed equally and are listed in alphabetical order.
          Subject classes: cs.CV, cs.LG

          Keywords: Computer vision & pattern recognition, Artificial intelligence
