
      Mnemonics Training: Multi-Class Incremental Learning without Forgetting

      Preprint


          Abstract

          Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. There is, however, an inherent trade-off between learning new concepts effectively and retaining previous ones, and tipping the balance toward the new can cause catastrophic forgetting of the old. To alleviate this issue, it has been proposed to keep around a few examples of the previous concepts, but the effectiveness of this approach heavily depends on how representative these examples are. This paper proposes a novel and automatic framework we call mnemonics, in which we parameterize exemplars and make them optimizable in an end-to-end manner. We train the framework through bilevel optimization, i.e., at the model level and the exemplar level. We conduct extensive experiments on three MCIL benchmarks, CIFAR-100, ImageNet-Subset and ImageNet, and show that using mnemonics exemplars can surpass the state-of-the-art by a large margin. Interestingly, the mnemonics exemplars tend to lie on the boundaries between classes.
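          The abstract's bilevel setup is compact, so the following is a minimal sketch of the idea, not the authors' implementation: exemplars are kept as trainable tensors, an inner gradient step adapts a classifier to them, and an outer loss on held-out old-class data is backpropagated through that step into the exemplar values. Everything concrete here is an assumption for illustration only: PyTorch, random features standing in for real images, a single linear classifier, and the learning rates.

          ```python
          # Sketch of bilevel exemplar optimization (illustrative, not the paper's code).
          import torch
          import torch.nn.functional as F

          torch.manual_seed(0)
          D, C, K = 32, 5, 10                    # feature dim, old classes, exemplars per class

          # Toy stand-ins for held-out old-class data (real use: old-class samples).
          x_val = torch.randn(200, D)
          y_val = torch.randint(0, C, (200,))

          # Exemplars are parameterized directly, so they can be optimized end-to-end.
          exemplars = torch.nn.Parameter(torch.randn(C * K, D))
          y_ex = torch.arange(C).repeat_interleave(K)

          W = torch.zeros(C, D, requires_grad=True)      # single linear classifier
          ex_opt = torch.optim.Adam([exemplars], lr=1e-2)

          for step in range(100):
              # Model-level (inner) step: adapt the classifier to the exemplars with one
              # differentiable gradient step; create_graph keeps the path to the exemplars.
              inner_loss = F.cross_entropy(exemplars @ W.t(), y_ex)
              (grad_W,) = torch.autograd.grad(inner_loss, W, create_graph=True)
              W_adapted = W - 0.1 * grad_W

              # Exemplar-level (outer) step: the adapted classifier should still perform
              # well on held-out old-class data; this gradient flows into the exemplars.
              outer_loss = F.cross_entropy(x_val @ W_adapted.t(), y_val)
              ex_opt.zero_grad()
              outer_loss.backward()
              ex_opt.step()

              # Keep the adapted weights (detached) as the classifier state for next round.
              W = W_adapted.detach().requires_grad_(True)

          print("final outer loss:", float(outer_loss))
          ```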


          Author and article information

          Journal
          Publication date: 24 February 2020
          Article: 2002.10211 (arXiv)
          Record ID: fa50917f-4b95-4582-bf30-f7e60c5519b2

          License: http://creativecommons.org/licenses/by-nc-sa/4.0/

          History
          Custom metadata
          Accepted by CVPR 2020
          cs.CV stat.ML

          Computer vision & Pattern recognition, Machine learning
