      Open Access

      A Stochastic Subgradient Method for Nonsmooth Nonconvex Multi-Level Composition Optimization

      Preprint


          Abstract

          We propose a single time-scale stochastic subgradient method for constrained optimization of a composition of several nonsmooth and nonconvex functions. The functions are assumed to be locally Lipschitz and differentiable in a generalized sense. Only stochastic estimates of the values and generalized derivatives of the functions are used, and the method is parameter-free. We prove convergence of the method with probability one by associating with it a system of differential inclusions and devising a nondifferentiable Lyapunov function for this system. For problems whose functions have Lipschitz continuous derivatives, the method finds a point satisfying an optimality measure with error of order \(1/\sqrt{N}\) after executing \(N\) iterations with a constant stepsize.
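          The key phrase in the abstract is "single time-scale": one common stepsize drives both the iterate update and the running estimate of the inner function values, rather than using a faster auxiliary stepsize for the tracking sequence. The hedged toy sketch below illustrates that idea for a smooth two-level composition \(f(g(x))\) with noisy oracles; the specific functions, noise model, stepsize, and tracking update are illustrative assumptions, not the paper's exact algorithm or its nonsmooth generality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-level composition: minimize f(g(x)) with
#   g(x) = x         (inner map, observed with additive noise)
#   f(u) = ||u||^2   (outer map, gradient observed with additive noise)
# so the true objective is ||x||^2, minimized at x = 0.

def g_sample(x):
    """Noisy stochastic estimate of the inner value g(x)."""
    return x + 0.1 * rng.standard_normal(x.shape)

def g_jac_sample(x):
    """Stochastic estimate of the Jacobian of g (exact for g(x) = x)."""
    return np.eye(x.size)

def f_grad_sample(u):
    """Noisy stochastic estimate of the outer gradient f'(u) = 2u."""
    return 2.0 * u + 0.1 * rng.standard_normal(u.shape)

x = np.array([2.0, -1.5])   # decision variable
u = g_sample(x)             # tracking sequence: running estimate of g(x)
a = 0.05                    # one constant stepsize for BOTH updates

for _ in range(2000):
    # Track the inner function value with the same stepsize a ...
    u = u + a * (g_sample(x) - u)
    # ... and take a chain-rule (sub)gradient step with that same a:
    grad = g_jac_sample(x).T @ f_grad_sample(u)
    x = x - a * grad

print(x)  # x should end up near the minimizer [0, 0]
```

Using a single stepsize for both recursions is what makes the method "single time-scale"; classical analyses of such nested problems instead let the tracking sequence evolve on a faster time scale.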


          Author and article information

          Published: 28 January 2020
          Type: Article (preprint)
          arXiv ID: 2001.10669
          Record ID: b3e3799b-0280-4335-b908-f9926c8a7cc2
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
          MSC classes: 90C15, 49J52, 62L20
          arXiv subject class: math.OC
          Subject: Numerical methods
