      Is Open Access

      Phase Transitions, Optimal Errors and Optimality of Message-Passing in Generalized Linear Models

      Preprint


          Abstract

          We consider generalized linear models where an unknown \(n\)-dimensional signal vector is observed through the successive application of a random matrix and a non-linear (possibly probabilistic) componentwise function. We consider these models in the high-dimensional limit, where the observation consists of \(m\) points and \(m/n \to \alpha\), with \(\alpha\) staying finite in the limit \(m, n \to \infty\). This situation is ubiquitous in applications ranging from supervised machine learning to signal processing. A substantial amount of work suggests that both the inference and learning tasks in these problems have sharp intrinsic limitations when the available data become too scarce or too noisy. Here, we provide rigorous asymptotic predictions for these thresholds through the proof of a simple expression for the mutual information between the observations and the signal. As a consequence of this expression, we also obtain the optimal value of the generalization error in many statistical learning models of interest, such as the teacher-student binary perceptron, and introduce several new models with remarkable properties. We compute these thresholds (or "phase transitions") using ideas from statistical physics that are turned into rigorous methods thanks to a new powerful smart-path interpolation technique called the stochastic interpolation method, which has recently been introduced by two of the authors. Moreover, we show that a polynomial-time algorithm referred to as generalized approximate message-passing reaches the optimal generalization performance for a large set of parameters in these problems. Our results clarify the difficulties and challenges one has to face when solving complex high-dimensional statistical problems.
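          The observation model described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes an i.i.d. Gaussian random matrix with variance \(1/n\), a binary teacher signal, and takes the componentwise non-linearity to be the sign function, which recovers the (noiseless) teacher-student binary perceptron mentioned above.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 1000                  # signal dimension
alpha = 2.0               # measurement ratio m/n, held finite as m, n -> infinity
m = int(alpha * n)

# Unknown binary signal (the "teacher" weights in the perceptron setting).
x = rng.choice([-1.0, 1.0], size=n)

# Random matrix with i.i.d. Gaussian entries of variance 1/n.
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(m, n))

# Componentwise non-linearity applied to the linear projection:
# y = phi(A x), here phi = sign (noiseless binary perceptron outputs).
y = np.sign(A @ x)

print(y.shape)  # (2000,)
```

The inference task is then to recover \(x\) from \(y\) and \(A\); the paper's phase transitions describe, as a function of \(\alpha\), when this is information-theoretically possible and when generalized approximate message-passing achieves the optimal performance.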


          Author and article information

          Journal
          10 August 2017
          Article
          arXiv: 1708.03395

          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          35 pages, 3 figures
          Subjects: cs.IT cond-mat.dis-nn cs.AI cs.LG math-ph math.IT math.MP
