
      A Stochastic Quasi-Newton Method for Large-Scale Optimization

      Preprint
      Richard H. Byrd, S. L. Hansen, Jorge Nocedal, Yoram Singer


          Abstract

          The question of how to incorporate curvature information in stochastic approximation methods is challenging. The direct application of classical quasi-Newton updating techniques for deterministic optimization leads to noisy curvature estimates that have harmful effects on the robustness of the iteration. In this paper, we propose a stochastic quasi-Newton method that is efficient, robust and scalable. It employs the classical BFGS update formula in its limited memory form, and is based on the observation that it is beneficial to collect curvature information pointwise, and at regular intervals, through (sub-sampled) Hessian-vector products. This technique differs from the classical approach, which would compute differences of gradients and where controlling the quality of the curvature estimates can be difficult. We present numerical results on problems arising in machine learning that suggest that the proposed method shows much promise.
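          To make the mechanism described above concrete, the following is a minimal sketch in Python/NumPy of an SQN-style iteration: stochastic gradient steps preconditioned by a limited-memory BFGS matrix whose curvature pairs are collected every L iterations, at averaged iterates, through a sub-sampled Hessian-vector product rather than through gradient differences. The logistic-regression test problem, the use of the two-loop recursion as the limited-memory form, and all hyperparameter values are illustrative assumptions, not the authors' reference implementation.

import numpy as np

rng = np.random.default_rng(0)

# Synthetic L2-regularized logistic regression (assumed test problem).
n, d, lam = 5000, 20, 1e-3
X = rng.standard_normal((n, d))
y = (X @ rng.standard_normal(d) > 0).astype(float)

def grad(w, idx):
    # Mini-batch stochastic gradient of the regularized logistic loss.
    p = 1.0 / (1.0 + np.exp(-X[idx] @ w))
    return X[idx].T @ (p - y[idx]) / len(idx) + lam * w

def hess_vec(w, v, idx):
    # Sub-sampled Hessian-vector product; the Hessian is never formed.
    p = 1.0 / (1.0 + np.exp(-X[idx] @ w))
    return X[idx].T @ ((p * (1.0 - p)) * (X[idx] @ v)) / len(idx) + lam * v

def two_loop(g, S, Y):
    # Classical L-BFGS two-loop recursion over the stored (s, y) pairs;
    # returns an approximation to (inverse Hessian) times g.
    q, alphas = g.copy(), []
    for s, yk in zip(reversed(S), reversed(Y)):
        a = (s @ q) / (yk @ s)
        alphas.append(a)
        q -= a * yk
    if S:
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])  # standard initial scaling
    for (s, yk), a in zip(zip(S, Y), reversed(alphas)):
        q += (a - (yk @ q) / (yk @ s)) * s
    return q

w = np.zeros(d)
S, Y, m = [], [], 10            # limited-memory storage of curvature pairs
L, b, bH = 20, 50, 500          # update interval and batch sizes (assumed)
w_sum, w_bar_prev = np.zeros(d), None

for t in range(1, 2001):
    g = grad(w, rng.choice(n, b, replace=False))
    w = w - (1.0 / t) * two_loop(g, S, Y)      # plain SGD until pairs exist
    w_sum += w
    if t % L == 0:                             # curvature at regular intervals
        w_bar, w_sum = w_sum / L, np.zeros(d)  # averaged iterate
        if w_bar_prev is not None:
            s = w_bar - w_bar_prev
            # Curvature from a sub-sampled Hessian-vector product,
            # not from differences of noisy stochastic gradients.
            yk = hess_vec(w_bar, s, rng.choice(n, bH, replace=False))
            if s @ yk > 1e-10:                 # keep only safely positive pairs
                S.append(s); Y.append(yk)
                if len(S) > m:
                    S.pop(0); Y.pop(0)
        w_bar_prev = w_bar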


          Author and article information

          Published: 2014-01-27 (v1); revised: 2015-02-18 (v2)
          arXiv ID: 1401.7020
          ScienceOpen record ID: 910fcef2-abd5-412c-8190-11b69b6d1f41

          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Subject classes: math.OC, cs.LG, stat.ML

          Keywords: Numerical methods, Machine learning, Artificial intelligence
