
      Size and depth of monotone neural networks: interpolation and approximation

      Preprint


          Abstract

          Monotone functions and data sets arise in a variety of applications. We study the interpolation problem for monotone data sets: the input is a monotone data set with \(n\) points, and the goal is to find a size- and depth-efficient monotone neural network, with non-negative parameters and threshold units, that interpolates the data set. We show that there are monotone data sets that cannot be interpolated by a monotone network of depth \(2\). On the other hand, we prove that for every monotone data set with \(n\) points in \(\mathbb{R}^d\), there exists an interpolating monotone network of depth \(4\) and size \(O(nd)\). Our interpolation result implies that every monotone function over \([0,1]^d\) can be approximated arbitrarily well by a depth-4 monotone network, improving on the previous best-known construction of depth \(d+1\). Finally, building on results from Boolean circuit complexity, we show that the inductive bias of having positive parameters can lead to a super-polynomial blow-up in the number of neurons when approximating monotone functions.
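
          The model class in the abstract can be made concrete with a small sketch. The Python snippet below is illustrative only (it is not the paper's depth-4 construction; the class name, architecture, and random parameters are our own assumptions): it builds a feed-forward network whose weights are entry-wise non-negative and whose hidden units are Heaviside thresholds, then numerically checks that such a network is monotone, i.e. that \(x \le y\) coordinate-wise implies \(f(x) \le f(y)\).

          import numpy as np

          def threshold(z):
              # Heaviside threshold unit: fires 1 when the pre-activation is >= 0.
              return (z >= 0).astype(float)

          class MonotoneThresholdNet:
              """Feed-forward net with entry-wise non-negative weights and threshold units.

              Illustrative sketch, not the construction from the paper. Non-negative
              weights composed with monotone activations yield a monotone function.
              """
              def __init__(self, weights, biases):
                  # Non-negative weights are the inductive bias studied in the paper;
                  # biases (thresholds) may be arbitrary.
                  assert all(np.all(np.asarray(W) >= 0) for W in weights)
                  self.weights, self.biases = weights, biases

              def __call__(self, x):
                  h = np.asarray(x, dtype=float)
                  for W, b in zip(self.weights[:-1], self.biases[:-1]):
                      h = threshold(W @ h + b)                       # hidden threshold layers
                  return float(self.weights[-1] @ h + self.biases[-1])  # linear readout

          rng = np.random.default_rng(0)
          d = 3
          W1, b1 = rng.uniform(0, 1, (5, d)), rng.uniform(-1, 1, 5)
          W2, b2 = rng.uniform(0, 1, (4, 5)), rng.uniform(-1, 1, 4)
          w3, b3 = rng.uniform(0, 1, 4), rng.uniform(-1, 1)
          f = MonotoneThresholdNet([W1, W2, w3], [b1, b2, b3])

          # Sanity check of monotonicity on random coordinate-wise dominated pairs.
          for _ in range(1000):
              x = rng.uniform(0, 1, d)
              y = x + rng.uniform(0, 1, d)   # y >= x coordinate-wise
              assert f(x) <= f(y)
          print("monotonicity check passed on 1000 random pairs")

          The check can never fail: each hidden layer is coordinate-wise monotone because the weights are non-negative and the threshold activation is non-decreasing, and the composition of monotone maps is monotone. The paper's question is the converse direction, namely how large and how deep such a network must be to interpolate a given monotone data set.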


          Author and article information

          Posted: 11 July 2022
          arXiv: 2207.05275
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

          Pages: 19
          Subjects: cs.LG, math.OC, stat.ML
          Keywords: Numerical methods, Machine learning, Artificial intelligence
