
      Training much deeper spiking neural networks with a small number of time-steps.


          Abstract

          Spiking Neural Network (SNN) is a promising energy-efficient neural architecture when implemented on neuromorphic hardware. The Artificial Neural Network (ANN) to SNN conversion method, which is the most effective SNN training method, has successfully converted moderately deep ANNs to SNNs with satisfactory performance. However, this method requires a large number of time-steps, which hurts the energy efficiency of SNNs. How to effectively convert a very deep ANN (e.g., more than 100 layers) to an SNN with a small number of time-steps remains a difficult task. To tackle this challenge, this paper makes the first attempt to propose a novel error analysis framework that takes both the "quantization error" and the "deviation error" into account, which come from the discretization of SNN dynamics (i.e., the neuron's coding scheme) and the inconstant input currents at intermediate layers, respectively. In particular, our theory reveals that the "deviation error" depends on both the spike threshold and the input variance. Based on this theoretical analysis, we further propose the Threshold Tuning and Residual Block Restructuring (TTRBR) method, which can convert very deep ANNs (>100 layers) to SNNs with negligible accuracy degradation while requiring only a small number of time-steps. With very deep networks, our TTRBR method achieves state-of-the-art (SOTA) performance on the CIFAR-10, CIFAR-100, and ImageNet classification tasks.
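          The "quantization error" the abstract refers to can be illustrated with a minimal sketch: an integrate-and-fire neuron with soft reset, driven by a constant input current, produces a firing rate that approximates a ReLU activation, and the approximation improves as the number of time-steps T grows. This is a generic illustration of rate-coded ANN-to-SNN conversion, not the paper's TTRBR method; the function name and constants are illustrative.

```python
def if_neuron_rate(x, threshold=1.0, T=32):
    """Simulate an integrate-and-fire neuron with soft reset for T
    time-steps and return its threshold-scaled firing rate.  With a
    constant input current x, the rate approximates relu(x), with a
    quantization error on the order of threshold / T."""
    v = 0.0
    spikes = 0
    for _ in range(T):
        v += x                      # integrate the input current
        if v >= threshold:
            spikes += 1
            v -= threshold          # soft reset keeps the residual charge
    return spikes * threshold / T   # rate-coded output

# The gap to relu(x) shrinks as the number of time-steps T grows,
# which is why conversion methods with few time-steps are hard.
x = 0.37
for T in (4, 16, 64, 256):
    print(T, abs(if_neuron_rate(x, T=T) - max(x, 0.0)))
```

Under constant input the error is bounded by threshold / T; the paper's point is that at intermediate layers the input currents are *not* constant, which introduces the additional "deviation error".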

          Related collections

          Most cited references (44)


          Spike timing-dependent plasticity: a Hebbian learning rule.

          Spike timing-dependent plasticity (STDP) as a Hebbian synaptic learning rule has been demonstrated in various neural circuits over a wide spectrum of species, from insects to humans. The dependence of synaptic modification on the order of pre- and postsynaptic spiking within a critical window of tens of milliseconds has profound functional implications. Over the past decade, significant progress has been made in understanding the cellular mechanisms of STDP at both excitatory and inhibitory synapses and of the associated changes in neuronal excitability and synaptic integration. Beyond the basic asymmetric window, recent studies have also revealed several layers of complexity in STDP, including its dependence on dendritic location, the nonlinear integration of synaptic modification induced by complex spike trains, and the modulation of STDP by inhibitory and neuromodulatory inputs. Finally, the functional consequences of STDP have been examined directly in an increasing number of neural circuits in vivo.
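            The asymmetric pair-based window described above can be sketched in a few lines; the amplitudes and time constant below are illustrative placeholders, not values from the review.

```python
import math

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP weight change for a spike-time difference
    dt = t_post - t_pre (milliseconds).  Pre-before-post pairing
    (dt > 0) potentiates the synapse; post-before-pre (dt < 0)
    depresses it, giving the asymmetric Hebbian window, with the
    effect decaying over tens of milliseconds."""
    if dt > 0:
        return a_plus * math.exp(-dt / tau)    # potentiation (LTP)
    elif dt < 0:
        return -a_minus * math.exp(dt / tau)   # depression (LTD)
    return 0.0

# Sample the window on both sides of coincidence.
for dt in (-40.0, -10.0, 10.0, 40.0):
    print(dt, stdp_dw(dt))
```

The later complexities the review mentions (dendritic location, spike-train nonlinearity, neuromodulation) are refinements on top of this basic pair-based rule.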

            Artificial brains. A million spiking-neuron integrated circuit with a scalable communication network and interface.

            Inspired by the brain's structure, we have developed an efficient, scalable, and flexible non-von Neumann architecture that leverages contemporary silicon technology. To demonstrate, we built a 5.4-billion-transistor chip with 4096 neurosynaptic cores interconnected via an intrachip network that integrates 1 million programmable spiking neurons and 256 million configurable synapses. Chips can be tiled in two dimensions via an interchip communication interface, seamlessly scaling the architecture to a cortexlike sheet of arbitrary size. The architecture is well suited to many applications that use complex neural networks in real time, for example, multiobject detection and classification. With 400-pixel-by-240-pixel video input at 30 frames per second, the chip consumes 63 milliwatts. Copyright © 2014, American Association for the Advancement of Science.

              Loihi: A Neuromorphic Manycore Processor with On-Chip Learning


                Author and article information

                Journal
                Neural Netw
                Neural networks : the official journal of the International Neural Network Society
                Elsevier BV
                1879-2782
                0893-6080
                Sep 2022
                Volume 153
                Affiliations
                [1 ] The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, Shenzhen 518115, China. Electronic address: qingyanmeng@link.cuhk.edu.cn.
                [2 ] Center for Data Science, Peking University, China. Electronic address: yanshen@pku.edu.cn.
                [3 ] Key Laboratory of Machine Perception (MOE), School of Artificial Intelligence, Peking University, China. Electronic address: mingqing_xiao@pku.edu.cn.
                [4 ] Key Laboratory of Machine Perception (MOE), School of Artificial Intelligence, Peking University, China; Institute for Artificial Intelligence, Peking University, China. Electronic address: yisen.wang@pku.edu.cn.
                [5 ] Key Laboratory of Machine Perception (MOE), School of Artificial Intelligence, Peking University, China; Institute for Artificial Intelligence, Peking University, China; Peng Cheng Laboratory, China. Electronic address: zlin@pku.edu.cn.
                [6 ] The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, Shenzhen 518115, China. Electronic address: luozq@cuhk.edu.cn.
                Article
                PII: S0893-6080(22)00206-4
                DOI: 10.1016/j.neunet.2022.06.001
                PMID: 35759953

                Keywords
                Spiking neural networks, ANN-to-SNN conversion, Conversion error analysis
