      Is Open Access

      Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms

      research-article


          Abstract

          Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs), are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale deep networks require vast computing resources, leading to high power requirements and communication overheads. Ongoing work on the design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but these platforms have to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations, are studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision, down to almost two bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use case for large-scale hybrid analog-digital or digital neuromorphic platforms such as SpiNNaker, which can execute large but precision-constrained deep networks in real time.
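The abstract does not spell out the adapted training mechanism, but the general idea of training with the target platform's bit precision in mind can be sketched as follows. This is a minimal illustration, assuming a signed fixed-point weight grid; the function name, parameter values, and layer sizes are hypothetical, not taken from the paper.

```python
import numpy as np

def quantize_weights(w, n_bits=4, w_max=1.0):
    """Round weights to the nearest level of a signed fixed-point grid
    with n_bits of fractional precision. This is a generic stand-in for
    a hardware weight format; the exact format is platform-specific."""
    step = w_max / (2 ** n_bits)  # smallest representable increment
    return np.clip(np.round(w / step) * step, -w_max, w_max)

# "Train with quantization" sketch: apply the hardware precision
# constraint after each weight update, so learning adapts to the
# reduced-precision grid rather than assuming full-precision weights.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(784, 500))    # e.g. visible-to-hidden weights
grad = rng.normal(0.0, 0.01, size=w.shape)   # placeholder gradient
w = quantize_weights(w + 0.1 * grad, n_bits=4)
```

After quantization every weight lies on the hardware grid, so the trained network behaves identically when deployed on the precision-constrained platform.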


          Most cited references (39)


          Deep Learning in Neural Networks: An Overview

          (2014)
          In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relevant work, much of it from the previous millennium. Shallow and deep learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpropagation), unsupervised learning, reinforcement learning & evolutionary computation, and indirect search for short programs encoding deep and large networks.

            Nanoelectronic programmable synapses based on phase change materials for brain-inspired computing.

            Brain-inspired computing is an emerging field, which aims to extend the capabilities of information technology beyond digital logic. A compact nanoscale device, emulating biological synapses, is needed as the building block for brain-like computational systems. Here, we report a new nanoscale electronic synapse based on technologically mature phase change materials employed in optical data storage and nonvolatile memory applications. We utilize continuous resistance transitions in phase change materials to mimic the analog nature of biological synapses, enabling the implementation of a synaptic learning rule. We demonstrate different forms of spike-timing-dependent plasticity using the same nanoscale synapse with picojoule level energy consumption.
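The spike-timing-dependent plasticity (STDP) mentioned above can be illustrated with the standard pair-based rule: the weight change depends exponentially on the time difference between pre- and postsynaptic spikes. This is a generic textbook sketch, not the update rule of the cited phase-change device; the amplitudes and time constant are illustrative.

```python
import math

def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for a spike-time difference
    delta_t = t_post - t_pre (in ms). Potentiation when the presynaptic
    spike precedes the postsynaptic spike (delta_t > 0), depression
    otherwise. Parameter values are illustrative only."""
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau)   # long-term potentiation
    return -a_minus * math.exp(delta_t / tau)      # long-term depression
```

The magnitude of the change decays as the two spikes move further apart in time, so only near-coincident spike pairs modify the synapse appreciably.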

              PyNN: A Common Interface for Neuronal Network Simulators

              Computational neuroscience has produced a diversity of software for simulations of networks of spiking neurons, with both negative and positive consequences. On the one hand, each simulator uses its own programming or configuration language, leading to considerable difficulty in porting models from one simulator to another. This impedes communication between investigators and makes it harder to reproduce and build on the work of others. On the other hand, simulation results can be cross-checked between different simulators, giving greater confidence in their correctness, and each simulator has different optimizations, so the most appropriate simulator can be chosen for a given modelling task. A common programming interface to multiple simulators would reduce or eliminate the problems of simulator diversity while retaining the benefits. PyNN is such an interface, making it possible to write a simulation script once, using the Python programming language, and run it without modification on any supported simulator (currently NEURON, NEST, PCSIM, Brian and the Heidelberg VLSI neuromorphic hardware). PyNN increases the productivity of neuronal network modelling by providing high-level abstraction, by promoting code sharing and reuse, and by providing a foundation for simulator-agnostic analysis, visualization and data-management tools. PyNN increases the reliability of modelling studies by making it much easier to check results on multiple simulators. PyNN is open-source software and is available from http://neuralensemble.org/PyNN.

                Author and article information

                Journal
                Frontiers in Neuroscience (Front. Neurosci.)
                Frontiers Media S.A.
                ISSN: 1662-4548 (print); 1662-453X (electronic)
                09 July 2015
                Volume 9, Article 222
                Affiliations
                [1] Advanced Processor Technologies Group, School of Computer Science, University of Manchester, Manchester, UK
                [2] Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
                [3] Centre National de la Recherche Scientifique UMR 7210, Equipe de Vision et Calcul Naturel, Vision Institute, UMR S968 Inserm, CHNO des Quinze-Vingts, Université Pierre et Marie Curie, Paris, France
                Author notes

                Edited by: John V. Arthur, IBM Almaden Research Center, USA

                Reviewed by: Emre O. Neftci, University of California, San Diego, USA; Srinjoy Mitra, Inter University Microelectronics Centre, Belgium; Damien Querlioz, University Paris-Sud, France

                *Correspondence: Evangelos Stromatias, Advanced Processor Technologies Group, School of Computer Science, University of Manchester, Kilburn Building, Oxford Road, Manchester M13 9PL, UK, Stromate@cs.man.ac.uk;
                Shih-Chii Liu, Institute of Neuroinformatics, University of Zurich and ETH Zurich, Winterthurerstrasse 190, 8057 Zurich, Switzerland, shih@ini.uzh.edu

                This article was submitted to Neuromorphic Engineering, a section of the journal Frontiers in Neuroscience

                †These authors have contributed equally to this work.

                Article
                DOI: 10.3389/fnins.2015.00222
                PMCID: PMC4496577
                PMID: 26217169
                Copyright © 2015 Stromatias, Neil, Pfeiffer, Galluppi, Furber and Liu.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 12 March 2015
                Accepted: 03 June 2015
                Page count
                Figures: 9, Tables: 2, Equations: 10, References: 62, Pages: 14, Words: 10834
                Funding
                Funded by: SNF grant
                Award ID: 200021_146608
                Funded by: SNF grant
                Award ID: 200021_135066
                Funded by: EPSRC grant BIMPA
                Award ID: EP/G015740/1
                Funded by: EU project SeeBetter
                Award ID: FP7-ICT-270324
                Funded by: EU grant HBP
                Award ID: FP7-604102
                Funded by: EU grant BrainScaleS-Extension
                Award ID: FP7-287701
                Categories
                Neuroscience
                Original Research

                Neurosciences
                deep belief networks, spiking neural networks, SpiNNaker, noise robustness, neuro-inspired hardware
