      Supervised learning in spiking neural networks with FORCE training

      Research article, Nature Communications (Nature Publishing Group UK)


          Abstract

          Populations of neurons display an extraordinary diversity in the behaviors they affect and display. Machine learning techniques have recently emerged that allow us to create networks of model neurons that display behaviors of similar complexity. Here we demonstrate the direct applicability of one such technique, the FORCE method, to spiking neural networks. We train these networks to mimic dynamical systems, classify inputs, and store discrete sequences that correspond to the notes of a song. Finally, we use FORCE training to create two biologically motivated model circuits. One is inspired by the zebra finch and successfully reproduces songbird singing. The second network is motivated by the hippocampus and is trained to store and replay a movie scene. FORCE trained networks reproduce behaviors comparable in complexity to their inspired circuits and yield information not easily obtainable with other techniques, such as behavioral responses to pharmacological manipulations and spike timing statistics.
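The paper's spiking implementation is not reproduced here, but the core of FORCE training is online recursive least squares (RLS) applied to a linear decoder whose output is fed back into a fixed recurrent network. The following is a minimal rate-based sketch of that idea; the network size, gains, target signal, and update schedule are illustrative choices, not the parameters used for the spiking networks in the article.

```python
import numpy as np

# Rate-based sketch of FORCE learning: a fixed recurrent "reservoir" runs
# while RLS adapts a linear decoder phi so the fed-back output z tracks a
# target (here, a 5 Hz sine). All parameters are illustrative.
rng = np.random.default_rng(0)
N, dt, T = 300, 1e-3, 3000          # neurons, time step (s), training steps
G, Q = 1.5, 1.0                     # recurrent gain, feedback gain
J = G * rng.standard_normal((N, N)) / np.sqrt(N)  # static recurrent weights
eta = Q * (2 * rng.random(N) - 1)   # fixed feedback (encoding) weights
phi = np.zeros(N)                   # learned decoder (the only trained part)
P = np.eye(N)                       # RLS inverse-correlation matrix
x = 0.5 * rng.standard_normal(N)    # network state

def target(t):
    return np.sin(2 * np.pi * 5 * t)

err_hist = []
for step in range(T):
    r = np.tanh(x)                              # unit activations
    z = phi @ r                                 # decoded network output
    x += dt / 0.01 * (-x + J @ r + eta * z)     # leaky dynamics + feedback
    if step % 2 == 0:                           # RLS decoder update
        e = z - target(step * dt)               # instantaneous error
        Pr = P @ r
        P -= np.outer(Pr, Pr) / (1.0 + r @ Pr)  # rank-1 update of P
        phi -= e * (P @ r)                      # error-driven decoder step
        err_hist.append(abs(e))

early, late = np.mean(err_hist[:50]), np.mean(err_hist[-50:])
print(f"early |error|={early:.3f}  late |error|={late:.3f}")
```

After a few target periods the decoded output tracks the sine and the instantaneous error shrinks; in the article the same decoder update is applied to filtered spike trains rather than rates.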

          Abstract

          FORCE training is a recently developed machine learning technique for constructing functional networks of model neurons. Here the authors implement FORCE training in models of spiking neuronal networks and demonstrate that these networks can be trained to exhibit different dynamic behaviours.

          Related collections

          Most cited references (57)


          EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis.

          Evidence is presented that EEG oscillations in the alpha and theta band reflect cognitive and memory performance in particular. Good performance is related to two types of EEG phenomena (i) a tonic increase in alpha but a decrease in theta power, and (ii) a large phasic (event-related) decrease in alpha but increase in theta, depending on the type of memory demands. Because alpha frequency shows large interindividual differences which are related to age and memory performance, this double dissociation between alpha vs. theta and tonic vs. phasic changes can be observed only if fixed frequency bands are abandoned. It is suggested to adjust the frequency windows of alpha and theta for each subject by using individual alpha frequency as an anchor point. Based on this procedure, a consistent interpretation of a variety of findings is made possible. As an example, in a similar way as brain volume does, upper alpha power increases (but theta power decreases) from early childhood to adulthood, whereas the opposite holds true for the late part of the lifespan. Alpha power is lowered and theta power enhanced in subjects with a variety of different neurological disorders. Furthermore, after sustained wakefulness and during the transition from waking to sleeping when the ability to respond to external stimuli ceases, upper alpha power decreases, whereas theta increases. Event-related changes indicate that the extent of upper alpha desynchronization is positively correlated with (semantic) long-term memory performance, whereas theta synchronization is positively correlated with the ability to encode new information. The reviewed findings are interpreted on the basis of brain oscillations. It is suggested that the encoding of new information is reflected by theta oscillations in hippocampo-cortical feedback loops, whereas search and retrieval processes in (semantic) long-term memory are reflected by upper alpha oscillations in thalamo-cortical feedback loops. 
Copyright 1999 Elsevier Science B.V.

            Theta oscillations in the hippocampus.

            Theta oscillations represent the "on-line" state of the hippocampus. The extracellular currents underlying theta waves are generated mainly by the entorhinal input, CA3 (Schaffer) collaterals, and voltage-dependent Ca(2+) currents in pyramidal cell dendrites. The rhythm is believed to be critical for temporal coding/decoding of active neuronal ensembles and the modification of synaptic weights. Nevertheless, numerous critical issues regarding both the generation of theta oscillations and their functional significance remain challenges for future research.

              Real-time computing without stable states: a new framework for neural computation based on perturbations.

              A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
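The liquid state machine described above keeps the recurrent circuit fixed and trains only readouts on its transient states. A minimal echo-state-style sketch of that separation follows, using batch ridge regression in place of an online learner and a simple fading-memory task (recovering the input from a few steps in the past); the reservoir size, scaling, and delay are illustrative choices, not taken from the paper.

```python
import numpy as np

# Fixed random reservoir + trained linear readout, in the spirit of
# liquid-state / reservoir computing. Only w_out is learned.
rng = np.random.default_rng(1)
N, T, delay = 200, 2000, 3
W = 0.9 * rng.standard_normal((N, N)) / np.sqrt(N)  # fixed recurrent weights
w_in = 0.5 * rng.standard_normal(N)                 # fixed input weights
u = rng.standard_normal(T)                          # random input stream

X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])    # reservoir update (never trained)
    X[t] = x

# Train the readout by ridge regression to output the input from
# `delay` steps in the past -- a direct test of fading memory.
Xtr, ytr = X[delay:], u[:-delay]
w_out = np.linalg.solve(Xtr.T @ Xtr + 1e-3 * np.eye(N), Xtr.T @ ytr)
corr = np.corrcoef(Xtr @ w_out, ytr)[0, 1]
print(f"readout correlation with {delay}-step-delayed input: {corr:.3f}")
```

The past input is never stored in any stable state; it is recoverable only because the reservoir's transient dynamics retain a high-dimensional trace of recent inputs that a purely linear readout can decode.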

                Author and article information

                Contributors
                c.clopath@imperial.ac.uk
                Journal
                Nat Commun
                Nat Commun
                Nature Communications
                Nature Publishing Group UK (London )
                2041-1723
                20 December 2017
                2017
                Volume: 8
                Article number: 2208
                Affiliations
                Department of Bioengineering, Imperial College London, Royal School of Mines, London, SW7 2AZ, UK (ISNI 0000 0001 2113 8111; GRID grid.7445.2)
                Article
                1827
                DOI: 10.1038/s41467-017-01827-3
                5738356
                28232747
                c100ee37-d355-4e6a-b50e-800b49cd23d7
                © The Author(s) 2017

                Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

                History
                Received: 19 December 2016
                Accepted: 19 October 2017
                Categories
                Article

