
      Communication-Efficient Stochastic Gradient Descent Ascent with Momentum Algorithms

Proceedings article
International Joint Conferences on Artificial Intelligence Organization
Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23)
Artificial Intelligence
September 19, 2023 - September 25, 2023


          Abstract

Numerous machine learning models can be formulated as stochastic minimax optimization problems, such as imbalanced data classification with AUC maximization. Developing efficient algorithms for such problems is therefore both important and necessary. However, most existing algorithms restrict their focus to the single-machine setting, so they cannot cope with the large communication overhead of a distributed training system. Moreover, most existing communication-efficient optimization algorithms address only the traditional minimization problem and fail to handle minimax optimization. To address these challenges, in this paper we develop two novel communication-efficient stochastic gradient descent ascent with momentum algorithms for the distributed minimax optimization problem, which significantly reduce the communication cost via a two-way compression scheme. The compressed momentum, however, makes it considerably challenging to investigate the convergence rate of our algorithms, especially in the presence of the interaction between the minimization and maximization subproblems. We address these challenges and establish the convergence rate of our algorithms for nonconvex-strongly-concave problems. To the best of our knowledge, ours are the first communication-efficient algorithms with theoretical guarantees for the minimax optimization problem. Finally, we apply our algorithms to the distributed AUC maximization problem for imbalanced data classification. Extensive experimental results confirm the efficacy of our algorithms in saving communication costs.
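
          The abstract does not spell out the update rule, so the sketch below is only illustrative: a minimal single-machine view of one stochastic gradient descent ascent step with compressed momentum, assuming a hypothetical top-k sparsifier as the compression operator. The paper's actual momentum recursion, compressor, and two-way (worker-to-server and server-to-worker) communication protocol may differ.

          import numpy as np

          def topk_compress(v, k):
              # Hypothetical top-k sparsifier standing in for the (unspecified here)
              # compression operator: keep the k largest-magnitude entries, zero the rest.
              out = np.zeros_like(v)
              idx = np.argpartition(np.abs(v), -k)[-k:]
              out[idx] = v[idx]
              return out

          def sgda_momentum_step(x, y, grad_x, grad_y, m_x, m_y,
                                 lr=1e-2, beta=0.9, k=10):
              # One compressed stochastic gradient descent ascent step with momentum.
              # x: minimization variable; y: maximization variable.
              # grad_x, grad_y: stochastic gradients; m_x, m_y: momentum buffers.
              m_x = beta * m_x + (1 - beta) * grad_x  # a common momentum form;
              m_y = beta * m_y + (1 - beta) * grad_y  # the paper's recursion may differ
              # In a distributed loop, both the uplink (worker-to-server) and the
              # downlink (server-to-worker) messages would pass through the compressor,
              # which is what "two-way compression" refers to in the abstract.
              c_x = topk_compress(m_x, k)
              c_y = topk_compress(m_y, k)
              x = x - lr * c_x  # descent on the minimization variable
              y = y + lr * c_y  # ascent on the maximization variable
              return x, y, m_x, m_y

          # Toy usage with random stand-in gradients (no real model or data).
          rng = np.random.default_rng(0)
          d = 100
          x, y = np.zeros(d), np.zeros(d)
          m_x, m_y = np.zeros(d), np.zeros(d)
          for _ in range(5):
              gx, gy = rng.normal(size=d), rng.normal(size=d)
              x, y, m_x, m_y = sgda_momentum_step(x, y, gx, gy, m_x, m_y)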


          Author and article information

          Conference
          Thirty-Second International Joint Conference on Artificial Intelligence (IJCAI-23), 32nd edition
          Macau, SAR China, September 19, 2023 - September 25, 2023
          Organizer: International Joint Conferences on Artificial Intelligence Organization (IJCAI)
          Subject: Artificial Intelligence
          Published: August 2023
          Pages: 4602-4610

          Affiliations
          [1] Temple University
          [2] Dakota State University

          Article
          DOI: 10.24963/ijcai.2023/512
          ID: e4f47a50-a912-401b-af26-bdf72dc001c1
          © 2023
