
      Optimal Reduced Isotonic Regression

      Preprint

          Abstract

          Isotonic regression is a shape-constrained nonparametric regression in which the regression curve is an increasing step function. For \(n\) data points, the number of steps in the isotonic regression may be as large as \(n\), so standard isotonic regression has been criticized as overfitting the data or producing too complicated a representation. So-called "reduced" isotonic regression constrains the outcome to a specified number of steps \(b\), \(b \leq n\). However, because the previous exact algorithms for the reduced \(L_2\) regression took \(\Theta(n+bm^2)\) time, where \(m\) is the number of steps of the unconstrained isotonic regression, researchers considered them too slow and used approximations instead; other work settled for approximate solutions by using a greedy top-down approach. Here we give an algorithm that finds an exact solution in \(\Theta(n+bm)\) time, and a simpler algorithm taking \(\Theta(n+bm\log m)\) time. These algorithms also determine optimal \(k\)-means clustering of weighted 1-dimensional data.
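          To make the setting concrete, here is a minimal sketch of reduced \(L_2\) isotonic regression. It assumes the standard structural fact the abstract's \(m\)-dependent bounds rely on: an optimal \(b\)-step solution only pools consecutive level sets of the unconstrained isotonic regression. It implements the straightforward \(O(n+bm^2)\) dynamic program over the pooled blocks, i.e. the kind of "previous" exact approach the abstract describes, not the paper's faster \(\Theta(n+bm)\) algorithm. All function names are illustrative, not from the paper.

```python
def pava(y, w):
    """Unconstrained weighted L2 isotonic regression by Pool Adjacent
    Violators.  Returns blocks [count, W, WY, WY2] with increasing means."""
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([1, wi, wi * yi, wi * yi * yi])
        # pool while the last two block means are out of order
        while len(blocks) > 1 and \
                blocks[-2][2] * blocks[-1][1] >= blocks[-1][2] * blocks[-2][1]:
            top = blocks.pop()
            for idx in range(4):
                blocks[-1][idx] += top[idx]
    return blocks

def reduced_isotonic(y, w, b):
    """Exact reduced L2 isotonic regression with at most b steps,
    via an O(b m^2) dynamic program over the PAVA blocks."""
    blocks = pava(y, w)
    m = len(blocks)
    b = min(b, m)
    # prefix sums over blocks give O(1) pooling costs
    PW, PWY, PWY2 = [0.0], [0.0], [0.0]
    for _, W, WY, WY2 in blocks:
        PW.append(PW[-1] + W)
        PWY.append(PWY[-1] + WY)
        PWY2.append(PWY2[-1] + WY2)

    def cost(i, j):  # weighted SSE of pooling blocks i..j-1 into one level
        W, WY, WY2 = PW[j] - PW[i], PWY[j] - PWY[i], PWY2[j] - PWY2[i]
        return WY2 - WY * WY / W

    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(b + 1)]  # D[k][j]: best k-step fit of first j blocks
    D[0][0] = 0.0
    back = [[0] * (m + 1) for _ in range(b + 1)]
    for k in range(1, b + 1):
        for j in range(k, m + 1):
            for i in range(k - 1, j):
                c = D[k - 1][i] + cost(i, j)
                if c < D[k][j]:
                    D[k][j], back[k][j] = c, i

    # backtrack the optimal cuts and expand to per-point fitted values
    segs, j, k = [], m, b
    while k > 0:
        i = back[k][j]
        segs.append((i, j))
        j, k = i, k - 1
    fit = []
    for i, j in reversed(segs):
        level = (PWY[j] - PWY[i]) / (PW[j] - PW[i])
        npts = sum(blocks[t][0] for t in range(i, j))
        fit.extend([level] * npts)
    return fit

print(reduced_isotonic([1, 2, 3, 4], [1, 1, 1, 1], 2))  # → [1.5, 1.5, 3.5, 3.5]
```

          The same dynamic program, applied to sorted weighted values instead of PAVA blocks, is exactly optimal 1-dimensional weighted \(k\)-means clustering, which is the connection the abstract draws.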


          Author and article information

          08 December 2014
          arXiv: 1412.2844
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
          Subjects: stat.CO, cs.DS
