      Is Open Access

      Single-Solution Hypervolume Maximization and its use for Improving Generalization of Neural Networks

      Preprint

          Abstract

          This paper introduces hypervolume maximization with a single solution as an alternative to mean loss minimization. The relationship between the two problems is proved through bounds on the cost function when an optimal solution to one problem is evaluated on the other, with a hyperparameter controlling the similarity between the two problems. The same hyperparameter allows higher weight to be placed on samples with higher loss when computing the hypervolume's gradient, whose normalized version can range from the mean-loss gradient to the max-loss gradient. An experiment on MNIST with a neural network validates the theory developed, showing that hypervolume maximization can behave similarly to mean loss minimization and can also provide better performance, resulting in a 20% reduction of the classification error on the test set.
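          The interpolation the abstract describes can be sketched numerically. Assuming the usual single-solution log-hypervolume with a scalar reference point mu, log H = sum_i log(mu - l_i) over per-sample losses l_i (the symbol mu and the function name below are illustrative, not taken from the paper's notation), the gradient weights each sample by 1/(mu - l_i); normalizing those weights shows the transition from mean-loss to max-loss behaviour:

```python
import numpy as np

def hv_weights(losses, mu):
    """Log-hypervolume of one solution and its normalized per-sample weights.

    Assumes the single-solution hypervolume H = prod_i (mu - l_i),
    so log H = sum_i log(mu - l_i), defined only for mu > max(losses).
    """
    gaps = mu - losses
    assert np.all(gaps > 0), "reference point mu must exceed every loss"
    log_hv = np.sum(np.log(gaps))
    # d(log H)/d(l_i) = -1/(mu - l_i): samples with higher loss
    # (smaller gap) receive larger weight in the gradient.
    w = 1.0 / gaps
    return log_hv, w / w.sum()

losses = np.array([0.1, 0.5, 2.0])
# mu far from the losses -> weights approach uniform (mean-loss regime)
_, w_far = hv_weights(losses, mu=1000.0)
# mu just above the largest loss -> weight concentrates on the hardest
# sample (max-loss regime)
_, w_near = hv_weights(losses, mu=2.1)
```

Here `w_far` is nearly uniform while `w_near` puts most of its mass on the sample with loss 2.0, which is the "mean loss to max loss" range the abstract refers to.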


          Author and article information

          Date: 2016-02-02
          Type: Article
          arXiv ID: 1602.01164
          Record ID: e4e7b071-2d00-4ddc-9500-f46d3bcc6aee
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Subjects: cs.LG, cs.NE, stat.ML
          Keywords: Machine learning, Neural & Evolutionary computing, Artificial intelligence
