
      A Delay Learning Algorithm Based on Spike Train Kernels for Spiking Neurons

      research-article


          Abstract

Neuroscience research confirms that synaptic delays are not constant but can be modulated. This paper proposes a supervised delay learning algorithm for spiking neurons with temporal encoding, in which both the weight and the delay of a synaptic connection can be adjusted to enhance learning performance. The algorithm first defines spike train kernels that transform discrete spike trains into continuous analog signals during the learning phase, so that common mathematical operations can be performed on them, and then derives supervised learning rules for synaptic weights and delays using the gradient descent method. The algorithm is successfully applied to various spike train learning tasks, and the effects of the synaptic delay parameters are analyzed in detail. Experimental results show that the network with dynamic delays achieves higher learning accuracy and requires fewer learning epochs than the network with static delays. The delay learning algorithm is further validated on a practical image classification problem, where it again achieves good classification performance with a proper receptive field. Synaptic delay learning is therefore significant for both practical applications and theoretical research on spiking neural networks.
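As a rough illustration of the kernel step described above, the sketch below convolves discrete spike trains with a Gaussian kernel to obtain continuous signals and measures the squared-error distance between an actual and a desired output train; gradients of such a distance with respect to synaptic weights and delays are what drive learning rules of this kind. The Gaussian kernel, time grid, and error measure here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def spike_train_to_signal(spike_times, t, tau=5.0):
    """Convolve a discrete spike train with a Gaussian kernel:
    f(t) = sum_i exp(-(t - t_i)^2 / (2 * tau^2))."""
    spike_times = np.asarray(spike_times, dtype=float)
    return np.exp(-((t[:, None] - spike_times[None, :]) ** 2)
                  / (2.0 * tau ** 2)).sum(axis=1)

# Time grid in milliseconds (illustrative values).
t = np.linspace(0.0, 100.0, 1001)

# Actual and desired output spike trains (hypothetical spike times).
actual  = spike_train_to_signal([12.0, 40.0, 77.0], t)
desired = spike_train_to_signal([15.0, 42.0, 80.0], t)

# Squared-error distance between the kernelized spike trains; in a
# gradient-descent rule, derivatives of this quantity with respect to
# weights and delays would adjust the synapses.
distance = np.trapz((actual - desired) ** 2, t)
print(f"spike train distance: {distance:.3f}")
```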


Most cited references: 45


          LabelMe: A Database and Web-Based Tool for Image Annotation


            Error-backpropagation in temporally encoded networks of spiking neurons


              Spiking neural networks.

Most current Artificial Neural Network (ANN) models are based on highly simplified brain dynamics. They have been used as powerful computational tools to solve complex pattern recognition, function estimation, and classification problems. ANNs have been evolving towards more powerful and more biologically realistic models. In the past decade, Spiking Neural Networks (SNNs) have been developed which comprise spiking neurons. Information transfer in these neurons mimics the information transfer in biological neurons, i.e., via the precise timing of spikes or a sequence of spikes. To facilitate learning in such networks, new learning algorithms based on varying degrees of biological plausibility have also been developed recently. Addition of the temporal dimension for information encoding in SNNs yields new insight into the dynamics of the human brain and could result in compact representations of large neural networks. As such, SNNs have great potential for solving complicated time-dependent pattern recognition problems because of their inherent dynamic representation. This article presents a state-of-the-art review of the development of spiking neurons and SNNs, and provides insight into their evolution as the third generation of neural networks.

                Author and article information

Journal
Front. Neurosci. (Frontiers in Neuroscience)
Publisher: Frontiers Media S.A.
ISSN: 1662-4548, 1662-453X
Published: 27 March 2019
Volume: 13
Article: 252
                Affiliations
College of Computer Science and Engineering, Northwest Normal University, Lanzhou, China
                Author notes

Edited by: Yansong Chua, Institute for Infocomm Research (A*STAR), Singapore

Reviewed by: Shaista Hussain, Institute of High Performance Computing (A*STAR), Singapore; Liam P. Maguire, Ulster University, United Kingdom

*Correspondence: Xianghong Lin linxh@nwnu.edu.cn

                This article was submitted to Neuromorphic Engineering, a section of the journal Frontiers in Neuroscience

Article
DOI: 10.3389/fnins.2019.00252
PMCID: 6445871
PMID: 30971877
                Copyright © 2019 Wang, Lin and Dang.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

History
Received: 08 October 2018
Accepted: 04 March 2019
                Page count
                Figures: 12, Tables: 3, Equations: 33, References: 47, Pages: 16, Words: 9115
                Categories
                Neuroscience
                Original Research

                Neurosciences
spiking neural networks, supervised learning, spike train kernels, delay learning, synaptic delays
