The Impact of Regularization on High-dimensional Logistic Regression

Preprint

Abstract

Logistic regression is commonly used for modeling dichotomous outcomes. In the classical setting, where the number of observations is much larger than the number of parameters, properties of the maximum likelihood estimator in logistic regression are well understood. Recently, Sur and Candes studied logistic regression in the high-dimensional regime, where the numbers of observations and parameters are comparable, and showed, among other things, that the maximum likelihood estimator is biased. In the high-dimensional regime the underlying parameter vector is often structured (sparse, block-sparse, finite-alphabet, etc.), and so in this paper we study regularized logistic regression (RLR), where a convex regularizer that encourages the desired structure is added to the negative log-likelihood function. An advantage of RLR is that it allows parameter recovery even for instances where the (unconstrained) maximum likelihood estimate does not exist. We provide a precise analysis of the performance of RLR via the solution of a system of six nonlinear equations, through which any performance metric of interest (mean, mean-squared error, probability of support recovery, etc.) can be explicitly computed. Our results generalize those of Sur and Candes, and we provide a detailed study for the cases of \(\ell_2^2\)-RLR and sparse (\(\ell_1\)-regularized) logistic regression. In both cases, we obtain explicit expressions for various performance metrics and can determine the value of the regularization parameter that optimizes the desired performance. The theory is validated by extensive numerical simulations across a range of parameter values and problem instances.
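As a concrete illustration of the setup described above, the following sketch fits sparse (\(\ell_1\)-regularized) logistic regression on synthetic data in the regime where the numbers of observations and parameters are comparable, then reports the mean-squared error and support recovery of the estimate. This is a minimal sketch using scikit-learn, not the authors' code; the dimensions, sparsity level, Gaussian design, and regularization strength are illustrative assumptions.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Proportional regime: n and p comparable (illustrative values, not from the paper).
    n, p, k = 400, 200, 20                          # observations, parameters, nonzeros
    beta = np.zeros(p)
    beta[:k] = 3.0                                  # sparse ground-truth parameter vector

    X = rng.standard_normal((n, p)) / np.sqrt(p)    # i.i.d. Gaussian design
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))  # dichotomous outcomes via logistic link

    # Sparse logistic regression: an l1 penalty added to the negative log-likelihood.
    # In scikit-learn, C is the inverse of the regularization strength.
    model = LogisticRegression(penalty="l1", C=1.0, solver="liblinear",
                               fit_intercept=False, max_iter=5000)
    model.fit(X, y)
    beta_hat = model.coef_.ravel()

    mse = np.mean((beta_hat - beta) ** 2)           # mean-squared error of the estimate
    support = np.mean(beta_hat[:k] != 0)            # fraction of the true support recovered
    print(f"MSE: {mse:.4f}  support recovery: {support:.2f}")

Sweeping C over a grid and recomputing these metrics is the empirical counterpart of choosing the regularization strength that optimizes a desired performance metric, which the paper characterizes analytically through its system of six nonlinear equations.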


Author and article information

Journal: arXiv (preprint)
Published: 09 June 2019
Article ID: 1906.03761
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
Subject classes: stat.ML, cs.IT, cs.LG, math.IT, math.PR
Keywords: Numerical methods, Information systems & theory, Machine learning, Probability, Artificial intelligence
