      Is Open Access

      Accelerated Stochastic Gradient Descent for Minimizing Finite Sums

      Preprint


          Abstract

          We propose an optimization method for minimizing finite sums of smooth convex functions. Our method combines accelerated gradient descent (AGD) with the stochastic variance reduced gradient (SVRG) in a mini-batch setting. Unlike SVRG, our method can be applied directly to both non-strongly and strongly convex problems. We show that it achieves a lower overall complexity than recently proposed methods that support non-strongly convex problems, and that it also attains a fast rate of convergence for strongly convex problems. Our experiments demonstrate the effectiveness of the method.
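
          For intuition about how the two ingredients named in the abstract fit together, the following is a minimal Python sketch that applies a Nesterov-style momentum step to a mini-batch SVRG-type variance-reduced gradient. The function name `accelerated_svrg_sketch`, the `grad_fi` oracle, and the fixed step-size and momentum constants are illustrative assumptions; this is a generic sketch under those assumptions, not the paper's algorithm, parameter schedule, or notation.

          ```python
          import numpy as np

          def accelerated_svrg_sketch(grad_fi, x0, n, n_epochs=20, inner_iters=None,
                                      batch_size=10, step_size=0.1, momentum=0.9,
                                      rng=None):
              """Illustrative sketch: Nesterov-style momentum combined with
              mini-batch SVRG-type variance-reduced gradients.

              grad_fi(x, idx) should return the average gradient of the component
              functions f_i, i in idx, at the point x (hypothetical oracle).
              """
              rng = np.random.default_rng() if rng is None else rng
              inner_iters = n // batch_size if inner_iters is None else inner_iters
              x = y = x0.copy()
              for _ in range(n_epochs):
                  x_snap = x.copy()
                  full_grad = grad_fi(x_snap, np.arange(n))   # full gradient at the snapshot
                  for _ in range(inner_iters):
                      idx = rng.choice(n, size=batch_size, replace=False)
                      # variance-reduced mini-batch gradient at the extrapolated point y
                      g = grad_fi(y, idx) - grad_fi(x_snap, idx) + full_grad
                      x_new = y - step_size * g               # gradient step
                      y = x_new + momentum * (x_new - x)      # Nesterov-style extrapolation
                      x = x_new
              return x

          # Toy usage on a least-squares problem (illustrative only)
          A = np.random.default_rng(0).normal(size=(200, 5))
          b = A @ np.ones(5)
          grad = lambda x, idx: A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)
          x_hat = accelerated_svrg_sketch(grad, np.zeros(5), n=200)
          ```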


          Author and article information

          Journal
          2015-06-09
          2015-06-10
          Article
          arXiv: 1506.03016
          c226b9ab-f0fb-4745-bb67-3fa06f212772

          http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          History
          Custom metadata
          [v2] corrected citation to proxSVRG, corrected typos in Figure 1(option2) and 3(R4 -> R3)
          stat.ML cs.LG

          Machine learning, Artificial intelligence
