      Semismooth Newton Coordinate Descent Algorithm for Elastic-Net Penalized Huber Loss and Quantile Regression

      Preprint (Open Access)

          Abstract

          We propose a semismooth Newton coordinate descent (SNCD) algorithm for elastic-net penalized robust regression with the Huber loss and for quantile regression. The SNCD is a novel combination of the semismooth Newton and coordinate descent algorithms. It is designed for loss functions with only first-order derivatives and is scalable to high-dimensional models. Unlike the standard coordinate descent method, the SNCD updates the regression parameters and the corresponding subdifferentials based on the concept of Newton derivatives. In addition, an adaptive version of the "strong rule" for screening predictors is incorporated to gain extra efficiency. As an important application of the proposed algorithm, we show that the SNCD can be used to compute the solution paths for penalized quantile regression. We establish the convergence properties of the algorithm. Through numerical experiments, we demonstrate that the proposed algorithm works well for high-dimensional data with heavy-tailed errors, and that, for quantile regression, the SNCD is considerably faster than the existing method and has better optimization performance. A breast cancer gene expression data set is used to illustrate the proposed algorithm.
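
          For orientation, the following is a sketch, in standard notation, of the two penalized objectives the abstract refers to: the elastic-net penalized Huber loss and the elastic-net penalized quantile (check) loss. The exact scaling and parametrization used in the paper may differ; here gamma, tau, lambda, and alpha denote the usual Huber threshold, quantile level, penalty level, and elastic-net mixing parameter.

          % Illustrative sketch only; the paper's exact parametrization may differ.
          \[
          \ell_\gamma(t) =
          \begin{cases}
            t^2 / (2\gamma), & |t| \le \gamma, \\
            |t| - \gamma/2,  & |t| > \gamma,
          \end{cases}
          \qquad
          \rho_\tau(t) = t \bigl( \tau - \mathbf{1}\{ t < 0 \} \bigr),
          \]
          \[
          \min_{\beta_0,\, \beta} \;
          \frac{1}{n} \sum_{i=1}^{n} L\!\left( y_i - \beta_0 - x_i^\top \beta \right)
          + \lambda \Bigl( \alpha \,\|\beta\|_1 + \tfrac{1-\alpha}{2}\, \|\beta\|_2^2 \Bigr),
          \]
          where \(L\) is either \(\ell_\gamma\) (Huber regression) or \(\rho_\tau\) (quantile regression).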


          Author and article information

          arXiv: 1509.02957

          Machine learning, Mathematical modeling & Computation
