Open Access

      Towards Principled Design of Deep Convolutional Networks: Introducing SimpNet

      Preprint


          Abstract

Major winning Convolutional Neural Networks (CNNs), such as VGGNet, ResNet, and DenseNet, include tens to hundreds of millions of parameters, which impose considerable computation and memory overhead. This limits their practical use in training and optimization for real-world applications. In contrast, lightweight architectures such as SqueezeNet have been proposed to address this issue, but they mainly suffer from low accuracy, having traded processing power for efficiency. These inefficiencies mostly stem from an ad hoc design procedure. In this work, we discuss and propose several crucial principles for efficient architecture design and elaborate intuitions concerning different aspects of the design procedure. Furthermore, we introduce a new layer called SAF-pooling, which improves the generalization power of the network, while keeping it simple, by selecting the best features. Based on these principles, we propose a simple architecture called SimpNet. We empirically show that SimpNet provides a good trade-off between computation/memory efficiency and accuracy, based solely on these primitive but crucial principles. SimpNet outperforms deeper and more complex architectures such as VGGNet, ResNet, and WideResNet on several well-known benchmarks, while having 2 to 25 times fewer parameters and operations. We obtain state-of-the-art results (in terms of the balance between accuracy and the number of parameters) on standard datasets such as CIFAR10, CIFAR100, MNIST, and SVHN. The implementations are available at https://github.com/Coderx7/SimpNet.
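The abstract describes SAF-pooling only at a high level: a pooling layer that keeps the network's strongest features to improve generalization. A minimal sketch of that idea, assuming (as an illustration, not taken from the paper) that it amounts to dropout-style masking followed by max pooling, might look like this; the function name and all hyperparameters are hypothetical:

```python
import numpy as np

def saf_pool_sketch(x, pool=2, drop_p=0.5, seed=None):
    """Illustrative sketch of the abstract's SAF-pooling idea:
    randomly zero some activations (simulating occlusion/dropout),
    then max-pool so only the strongest surviving features pass on.
    This is an assumption based on the abstract's wording, not the
    paper's exact definition."""
    rng = np.random.default_rng(seed)
    mask = rng.random(x.shape) >= drop_p      # dropout-style binary mask
    dropped = x * mask
    h, w = x.shape
    # non-overlapping max pooling over pool x pool windows
    dropped = dropped[:h - h % pool, :w - w % pool]
    return dropped.reshape(h // pool, pool, w // pool, pool).max(axis=(1, 3))

feat = np.arange(16, dtype=float).reshape(4, 4)
# with drop_p=0 this reduces to plain 2x2 max pooling
out = saf_pool_sketch(feat, pool=2, drop_p=0.0)
```

With `drop_p=0.0` the masking is a no-op and the sketch degenerates to ordinary max pooling, which makes the relationship between the two operations easy to see.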

Most cited references

• Polynomial Theory of Complex Systems
• Evaluation of Pooling Operations in Convolutional Architectures for Object Recognition
• Learned-Norm Pooling for Deep Feedforward and Recurrent Neural Networks

                Author and article information

                Journal
                17 February 2018
                Article
arXiv:1802.06205
                8b8830cf-e033-49ee-a288-bb0e5fe85374

License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

                History
                Custom metadata
Submitted to the IEEE TIP in December 2017; high-resolution images replaced with low-resolution counterparts due to the arXiv size limitation; 19 pages.
                cs.CV
