
      Neural Networks Should Be Wide Enough to Learn Disconnected Decision Regions

      Preprint


          Abstract

In the recent literature the important role of depth in deep learning has been emphasized. In this paper we argue that sufficient width of a feedforward network is equally important, by answering the simple question of under which conditions the decision regions of a neural network are connected. It turns out that for a class of activation functions including leaky ReLU, neural networks with a pyramidal structure, that is, where no layer has more hidden units than the input dimension, necessarily produce connected decision regions. This implies that a sufficiently wide layer is necessary to produce disconnected decision regions. We discuss the implications of this result for the construction of neural networks, in particular the relation to the problem of adversarial manipulation of classifiers.
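
The claim is easiest to see in one dimension. Below is a minimal NumPy sketch, not taken from the paper: the function names and the hand-chosen weights are illustrative assumptions. With input dimension d = 1, a hidden layer of width 2 is already wider than the input, and such a network can carve out a disconnected positive decision region:

    import numpy as np

    def leaky_relu(x, alpha=0.01):
        # Strictly increasing activation from the class covered by the result.
        return np.where(x > 0, x, alpha * x)

    def f(x):
        # Hand-chosen one-hidden-layer network of width 2 on 1-D input:
        # f(x) = sigma(x - 1) + sigma(-x - 1).
        h = leaky_relu(np.stack([x - 1.0, -x - 1.0]))
        return h.sum(axis=0)

    xs = np.linspace(-3.0, 3.0, 601)
    positive = f(xs) > 0  # decision region {x : f(x) > 0} sampled on a grid

    # Count maximal runs of consecutive positive samples; two runs means the
    # sampled decision region has two components (roughly x < -1 and x > 1).
    components = int(positive[0]) + int(np.sum(np.diff(positive.astype(int)) == 1))
    print("positive components:", components)  # -> 2

Conversely, by the result stated in the abstract, a leaky-ReLU network in which every hidden layer has at most one unit (the input dimension here) has a connected positive region, a single interval, and so cannot reproduce this sign pattern no matter how deep it is.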


            Author and article information

Posted: 28 February 2018
arXiv ID: 1803.00094
Record ID: de7d8c73-3318-44da-874b-6b183b8a06a9
Open access: yes
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
Subject classifications: cs.LG, cs.AI, cs.CV, stat.ML
