The Extreme Value Machine.

Abstract

It is often desirable to be able to recognize when inputs to a recognition function learned in a supervised manner correspond to classes unseen at training time. With this ability, new class labels could be assigned to these inputs by a human operator, allowing them to be incorporated into the recognition function - ideally under an efficient incremental update mechanism. While good algorithms that assume inputs from a fixed set of classes exist, e.g., artificial neural networks and kernel machines, it is not immediately obvious how to extend them to perform incremental learning in the presence of unknown query classes. Existing algorithms take little to no distributional information into account when learning recognition functions and lack a strong theoretical foundation. We address this gap by formulating a novel, theoretically sound classifier - the Extreme Value Machine (EVM). The EVM has a well-grounded interpretation derived from statistical Extreme Value Theory (EVT), and is the first classifier able to perform nonlinear, kernel-free, variable-bandwidth incremental learning. Compared to other classifiers in the same deep network-derived feature space, the EVM is accurate and efficient on an established benchmark partition of the ImageNet dataset.
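
The abstract describes the mechanism only at a high level, so the sketch below illustrates the general idea of an EVT-based open-set classifier of this kind: each training point gets a Weibull model fitted to the low tail of its margin distances (half the distances to its nearest points from other classes), which yields a "probability of sample inclusion" for any query, and a query whose best inclusion probability over all known classes falls below a threshold is rejected as unknown. This is a minimal illustration written for this summary, not the authors' reference implementation; the tail size tau, the rejection threshold delta, and the names fit_evm and predict_evm are assumptions.

import numpy as np
from scipy.stats import weibull_min


def fit_evm(X, y, tau=10):
    # For every training point, fit a Weibull distribution to its tau smallest
    # half-distances to points of OTHER classes (the low tail of the margin).
    models = []  # list of (anchor point, label, shape kappa, scale lam)
    for xi, yi in zip(X, y):
        negatives = X[y != yi]
        margins = 0.5 * np.sort(np.linalg.norm(negatives - xi, axis=1))[:tau]
        kappa, _, lam = weibull_min.fit(margins, floc=0.0)
        models.append((xi, yi, kappa, lam))
    return models


def predict_evm(models, x, delta=0.5):
    # The inclusion probability is the Weibull survival function evaluated at
    # the query's distance from an anchor point.
    best_label, best_psi = None, 0.0
    for xi, yi, kappa, lam in models:
        d = np.linalg.norm(x - xi)
        psi = np.exp(-((d / lam) ** kappa))
        if psi > best_psi:
            best_label, best_psi = yi, psi
    # Reject as "unknown" when no known class explains the query well enough.
    return best_label if best_psi >= delta else None


# Toy usage: two well-separated blobs are the known classes; a far-away
# query should fall below the threshold and be rejected as unknown.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
models = fit_evm(X, y)
print(predict_evm(models, np.array([0.1, 0.0])))    # expected: 0
print(predict_evm(models, np.array([10.0, 10.0])))  # expected: None (unknown)

Because the models are just per-point (anchor, shape, scale) triples built from local distances, new classes can be folded in by fitting models for their points and updating only the existing models whose margin tails they affect, which is roughly the kind of kernel-free incremental update the abstract refers to.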

Author and article information

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence (IEEE Trans Pattern Anal Mach Intell)
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
ISSN: 1939-3539; 0098-5589
Publication date: May 23 2017
Publication type: Article
DOI: 10.1109/TPAMI.2017.2707495
PMID: 28541894
