Open Access

      Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives

      Preprint

          Abstract

          In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose \(ESDACD\), a decentralized accelerated algorithm that requires only local synchrony. Its rate depends on the condition number \(\kappa\) of the local functions as well as on the network topology and delays. Under mild assumptions on the topology of the graph, \(ESDACD\) requires time \(O((\tau_{\max} + \Delta_{\max})\sqrt{{\kappa}/{\gamma}}\ln(\epsilon^{-1}))\) to reach a precision \(\epsilon\), where \(\gamma\) is the spectral gap of the graph, \(\tau_{\max}\) the maximum communication delay, and \(\Delta_{\max}\) the maximum computation time. It therefore matches the rate of \(SSDA\), which is optimal when \(\tau_{\max} = \Omega\left(\Delta_{\max}\right)\). Applying \(ESDACD\) to quadratic local functions yields an accelerated randomized gossip algorithm of rate \(O( \sqrt{\theta_{\rm gossip}/n})\), where \(\theta_{\rm gossip}\) is the rate of standard randomized gossip. To the best of our knowledge, it is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
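          For context on the baseline the abstract compares against: standard randomized gossip (the algorithm whose rate \(\theta_{\rm gossip}\) appears above) activates one random edge at a time and averages the two endpoint values. The sketch below is a minimal illustration of that plain baseline, not of \(ESDACD\) or its accelerated variant; the graph, round count, and function names are illustrative assumptions.

```python
import random

def randomized_gossip(values, edges, num_rounds, seed=0):
    """Plain randomized gossip averaging: each round, activate one
    random edge (i, j) and replace both endpoint values with their
    pairwise average. This is the standard baseline with rate
    theta_gossip referred to in the abstract, NOT the accelerated
    ESDACD-derived algorithm."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(num_rounds):
        i, j = rng.choice(edges)
        avg = (x[i] + x[j]) / 2.0  # pairwise averaging preserves the sum
        x[i] = x[j] = avg
    return x

# Illustrative usage: a 4-node cycle graph; all values converge
# toward the global mean (6.0 here) while the sum stays fixed.
vals = [0.0, 4.0, 8.0, 12.0]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
out = randomized_gossip(vals, edges, 500)
```

Each pairwise average is mass-preserving, so the network mean is invariant and every node contracts toward it; the accelerated algorithm in the paper improves the speed of this contraction.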


          Most cited references (22)


          Distributed Subgradient Methods for Multi-Agent Optimization


            Randomized gossip algorithms


              Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems


                Author and article information

                Preprint posted: 05 October 2018
                arXiv: 1810.02660
                License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
                Categories: math.OC, cs.DC, cs.LG
                Keywords: Numerical methods, Artificial intelligence, Networking & Internet architecture
