Scaling Up Distributed Stochastic Gradient Descent Using Variance Reduction

Preprint (Open Access)

Abstract

Variance-reduced stochastic gradient descent methods make it possible to fit models to large datasets with low iteration complexity and fast asymptotic convergence rates. However, they scale poorly in distributed settings. In this paper, we propose CentralVR, a highly parallel variance reduction method whose performance scales linearly with the number of worker nodes. We also propose distributed versions of popular variance reduction methods that support a high degree of parallelization. Unlike existing distributed stochastic gradient schemes, CentralVR exhibits linear performance gains up to thousands of cores on massive datasets.
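
The abstract does not spell out the update rule, but variance reduction of this kind is typically SVRG-style: each stochastic gradient is corrected by the gradient of the same sample at a periodically refreshed reference point, plus the full gradient at that reference point. The sketch below is a minimal, generic SVRG-style loop for least-squares regression in Python/NumPy, shown only to illustrate the variance-reduced update; it is not the paper's CentralVR algorithm, and the function and parameter names (fit_svrg, step_size, epochs) are illustrative assumptions.

# Minimal SVRG-style sketch (illustrative only; not the paper's CentralVR).
# Assumed problem: least-squares regression, f(w) = (1/n) * sum_i (x_i @ w - y_i)^2 / 2.
import numpy as np

def grad_i(w, X, y, i):
    # Gradient of the i-th sample's loss.
    return (X[i] @ w - y[i]) * X[i]

def full_grad(w, X, y):
    # Full-batch gradient, recomputed once per outer epoch.
    return X.T @ (X @ w - y) / len(y)

def fit_svrg(X, y, step_size=0.05, epochs=20, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        w_ref = w.copy()                  # reference point for this epoch
        g_ref = full_grad(w_ref, X, y)    # full gradient at the reference point
        for _ in range(n):
            i = rng.integers(n)
            # Variance-reduced gradient: stochastic gradient at w, corrected by
            # the same sample's gradient at w_ref plus the full gradient g_ref.
            g = grad_i(w, X, y, i) - grad_i(w_ref, X, y, i) + g_ref
            w -= step_size * g
    return w

# Example usage on synthetic data.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
y = X @ w_true
w_hat = fit_svrg(X, y)
print(np.linalg.norm(w_hat - w_true))

In a distributed setting, a scheme of this shape would let each worker apply such corrected updates on its own data shard while a central parameter copy and the reference gradient are synchronized periodically; the paper's contribution is a method (CentralVR) for which that parallelization scales linearly with the number of workers.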
