Is Open Access

      Hard negative examples are hard, but useful

      Preprint


          Abstract

          Triplet loss is an extremely common approach to distance metric learning. Representations of images from the same class are optimized to be mapped closer together in an embedding space than representations of images from different classes. Much work on triplet losses focuses on selecting the most useful triplets of images to consider, with strategies that select dissimilar examples from the same class or similar examples from different classes. The consensus of previous research is that optimizing with the hardest negative examples leads to bad training behavior. That is a problem: these hardest negatives are precisely the cases where the distance metric fails to capture semantic similarity. In this paper, we characterize the space of triplets and derive why hard negatives make triplet loss training fail. We offer a simple fix to the loss function and show that, with this fix, optimizing with hard negative examples becomes feasible. This leads to more generalizable features, and image retrieval results that outperform the state of the art for datasets with high intra-class variance.
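          The standard setup the abstract describes can be sketched as follows: a triplet loss with batch-hard mining, where each anchor is paired with its farthest same-class example and its closest different-class example (the "hardest" negative). This is a minimal illustrative sketch of the conventional approach, not the paper's proposed fix; the function name and the margin value of 0.2 are assumptions.

```python
import numpy as np

def triplet_loss_hard_negative(embeddings, labels, margin=0.2):
    """Triplet loss with batch-hard mining (illustrative sketch).

    For each anchor in the batch, the positive is the farthest
    same-class example and the negative is the closest (hardest)
    different-class example.
    """
    # Pairwise Euclidean distances via the squared-norm expansion.
    sq = np.sum(embeddings ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * embeddings @ embeddings.T
    d = np.sqrt(np.maximum(d2, 0.0))

    same = labels[:, None] == labels[None, :]
    losses = []
    for i in range(len(labels)):
        pos_mask = same[i].copy()
        pos_mask[i] = False          # an anchor is not its own positive
        neg_mask = ~same[i]
        if not pos_mask.any() or not neg_mask.any():
            continue                 # no valid triplet for this anchor
        hardest_pos = d[i][pos_mask].max()   # farthest positive
        hardest_neg = d[i][neg_mask].min()   # closest negative
        losses.append(max(hardest_pos - hardest_neg + margin, 0.0))
    return float(np.mean(losses)) if losses else 0.0
```

          When classes are well separated the hinge is inactive and the loss is zero; when a hardest negative sits closer to the anchor than its positive does, the loss is positive, and gradient descent on it is exactly the regime the abstract says tends to behave badly.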


          Author and article information

          Journal
          24 July 2020
          Article: arXiv:2007.12749

          License: http://creativecommons.org/publicdomain/zero/1.0/

          Custom metadata
          CV, Triplet loss, Image embedding, 14 pages, 9 figures, ECCV 2020
          cs.CV cs.LG stat.ML

          Computer vision & pattern recognition, Machine learning, Artificial intelligence
