      Is Open Access

      Perturbing low dimensional activity manifolds in spiking neuronal networks

      research-article
      PLoS Computational Biology
      Public Library of Science


          Abstract

Several recent studies have shown that neural activity in vivo tends to be constrained to a low-dimensional manifold. Such activity does not arise in simulated neural networks with homogeneous connectivity, and it has been suggested that it is indicative of some other connectivity pattern in neuronal networks. In particular, this connectivity pattern appears to constrain learning, so that only neural activity patterns falling within the intrinsic manifold can be learned and elicited. Here, we use three different models of spiking neural networks (echo-state networks, the Neural Engineering Framework and Efficient Coding) to demonstrate how the intrinsic manifold can be made a direct consequence of the circuit connectivity. Using this relationship between the circuit connectivity and the intrinsic manifold, we show that learning patterns outside the intrinsic manifold corresponds to much larger changes in synaptic weights than learning patterns within it. Assuming that larger changes to synaptic weights require more extensive learning, this observation explains why learning is easier when it does not require the neural activity to leave its intrinsic manifold.

          Author summary

A network in the brain consists of thousands of neurons. A priori, we would expect such a network to have as many degrees of freedom as it has neurons. Surprisingly, experimental evidence suggests that local brain activity is confined to a subspace spanned by ~10 variables. Here, we employ three established approaches to construct spiking neuronal networks that exhibit low-dimensional activity. Using these models, we address a specific experimental observation, namely that monkeys can easily elicit any activity within this subspace but struggle to elicit activity outside it. Specifically, we show that tasks requiring animals to move the network activity outside the subspace would entail large changes in neuronal connectivity; consequently, animals are either slow to acquire such tasks or unable to do so.
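The "subspace spanned by ~10 variables" is typically quantified by applying PCA to recorded population activity. The sketch below uses synthetic data as a stand-in for a recording; the population size, latent dimensionality, noise level, and 90%-variance threshold are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_latent, n_samples = 100, 10, 2000

# Synthetic "recording": 10 latent signals mixed into 100 neurons,
# plus weak private noise on each neuron.
latents = rng.standard_normal((n_samples, n_latent))
mixing = rng.standard_normal((n_latent, n_neurons))
rates = latents @ mixing + 0.1 * rng.standard_normal((n_samples, n_neurons))

# Eigenvalues of the covariance matrix are the PCA variances.
cov = np.cov(rates, rowvar=False)
eig = np.sort(np.linalg.eigvalsh(cov))[::-1]

# Two common dimensionality estimates:
n90 = int(np.searchsorted(np.cumsum(eig) / eig.sum(), 0.9)) + 1
pr = eig.sum() ** 2 / np.sum(eig ** 2)       # participation ratio
print(n90, pr)   # both close to the 10 latent variables
```

Both estimates recover roughly the 10 embedded variables despite the 100 recorded neurons, which is the sense in which experimental activity is called low-dimensional.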

          Related collections

Most cited references (33)


          Real-time computing without stable states: a new framework for neural computation based on perturbations.

          A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
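The fading-memory idea described here, a trained linear readout recovering past inputs from the current transient state of a fixed random circuit, can be sketched with a rate-based reservoir. This is an echo-state-style stand-in for the spiking "liquid"; the network size, spectral radius, input scaling, delay, and ridge penalty are all arbitrary choices for the demonstration:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, delay = 300, 3000, 5           # reservoir size, samples, memory lag

# Fixed random recurrent circuit; nothing inside it is trained.
W = rng.standard_normal((N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # contractive dynamics
w_in = 0.5 * rng.standard_normal(N)

u = rng.uniform(-1, 1, T)            # random input stream
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])
    X[t] = x

# Only the linear readout is trained (ridge regression): it must report
# the input from `delay` steps ago using the current state alone.
A, Y = X[delay:], u[:-delay]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ Y)
corr = np.corrcoef(A @ w_out, Y)[0, 1]
print(corr)   # high correlation: the transient state retains recent inputs
```

No stable internal state encodes u(t - delay); the information survives only in the transient trajectory, yet a static linear readout extracts it, which is the "analog fading memory" claim in miniature.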

            Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication.

            We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains. The learning method is computationally efficient and easy to use. On a benchmark task of predicting a chaotic time series, accuracy is improved by a factor of 2400 over previous techniques. The potential for engineering applications is illustrated by equalizing a communication channel, where the signal error rate is improved by two orders of magnitude.
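The ESN recipe summarized above, a fixed random recurrent network in which only the linear readout weights are learned, can be sketched for one-step-ahead prediction of a synthetic signal. The signal, network size, spectral radius, washout length, and regularization below are illustrative choices, not the cited benchmark's settings:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T_train, T_test = 400, 2000, 500

# Target: a multi-frequency signal to be predicted one step ahead.
t = np.arange(T_train + T_test + 1)
s = np.sin(0.2 * t) * np.sin(0.0331 * t)

# Fixed random reservoir; only w_out is learned.
W = rng.standard_normal((N, N)) / np.sqrt(N)
W *= 0.95 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.standard_normal(N)

x = np.zeros(N)
states = np.empty((T_train + T_test, N))
for k in range(T_train + T_test):
    x = np.tanh(W @ x + w_in * s[k])
    states[k] = x

# Discard a washout period, then fit the readout by ridge regression.
washout = 100
Xtr, ytr = states[washout:T_train], s[washout + 1:T_train + 1]
w_out = np.linalg.solve(Xtr.T @ Xtr + 1e-6 * np.eye(N), Xtr.T @ ytr)

# Normalized RMSE on held-out data (predicting the mean would give 1.0).
Xte, yte = states[T_train:], s[T_train + 1:]
nrmse = np.sqrt(np.mean((Xte @ w_out - yte) ** 2)) / np.std(yte)
print(nrmse)   # far below 1
```

The computational economy is the point: training reduces to one linear least-squares solve, because the recurrent weights are never touched.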

              Neural population dynamics during reaching

              Most theories of motor cortex have assumed that neural activity represents movement parameters. This view derives from an analogous approach to primary visual cortex, where neural activity represents patterns of light. Yet it is unclear how well that analogy holds. Single-neuron responses in motor cortex appear strikingly complex, and there is marked disagreement regarding which movement parameters are represented. A better analogy might be with other motor systems, where a common principle is rhythmic neural activity. We found that motor cortex responses during reaching contain a brief but strong oscillatory component, something quite unexpected for a non-periodic behavior. Oscillation amplitude and phase followed naturally from the preparatory state, suggesting a mechanistic role for preparatory neural activity. These results demonstrate unexpected yet surprisingly simple structure in the population response. That underlying structure explains many of the confusing features of individual-neuron responses.

                Author and article information

                Contributors
Role: Data curation; Formal analysis; Investigation; Methodology; Writing – original draft; Writing – review & editing
Role: Conceptualization; Funding acquisition; Methodology; Supervision; Validation; Writing – original draft; Writing – review & editing
                Role: Editor
                Journal
PLoS Computational Biology (PLoS Comput Biol)
Public Library of Science (San Francisco, CA, USA)
ISSN: 1553-734X (print); 1553-7358 (electronic)
Published: 31 May 2019
Volume 15, Issue 5: e1007074
                Affiliations
                [1 ] Dept. of Computational Science and Technology, School of Electrical Engineering and Computer Science, KTH Royal Institute of Technology, Stockholm, Sweden
                [2 ] Dept. of Neuroscience, Karolinska Institutet, Stockholm, Sweden
                UCL, UNITED KINGDOM
                Author notes

                The authors have declared that no competing interests exist.

                Author information
                http://orcid.org/0000-0002-4754-4561
                http://orcid.org/0000-0002-8044-9195
Article
Manuscript ID: PCOMPBIOL-D-17-00774
DOI: 10.1371/journal.pcbi.1007074
PMC: 6586365
PMID: 31150376
bbf5761a-860f-4506-970a-ddfcec5c70f5
                © 2019 Wärnberg, Kumar

                This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

History
Received: 15 May 2017
Accepted: 7 May 2019
                Page count
                Figures: 6, Tables: 0, Pages: 23
                Funding
Funded by: StratNeuro Strategic Program in Neuroscience, Sweden
Funded by: Parkinsonfonden (funder-id: http://dx.doi.org/10.13039/100008444)
Funded by: Vetenskapsrådet (funder-id: http://dx.doi.org/10.13039/501100004359), Award ID: 2018-03118
Partial funding from the School of Electrical Engineering and Computer Science, KTH; Parkinsonfonden, Sweden; the Strategic Research Area Neuroscience (StratNeuro) program; and the Swedish Research Council (Vetenskapsrådet) is gratefully acknowledged. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
                Categories
                Research Article
Biology and Life Sciences > Cell Biology > Cellular Types > Animal Cells > Neurons
Biology and Life Sciences > Neuroscience > Cellular Neuroscience > Neurons
Computer and Information Sciences > Neural Networks
Biology and Life Sciences > Neuroscience > Neural Networks
Physical Sciences > Mathematics > Discrete Mathematics > Combinatorics > Permutation
Biology and Life Sciences > Physiology > Electrophysiology > Membrane Potential > Action Potentials
Medicine and Health Sciences > Physiology > Electrophysiology > Membrane Potential > Action Potentials
Biology and Life Sciences > Physiology > Electrophysiology > Neurophysiology > Action Potentials
Medicine and Health Sciences > Physiology > Electrophysiology > Neurophysiology > Action Potentials
Biology and Life Sciences > Neuroscience > Neurophysiology > Action Potentials
Biology and Life Sciences > Computational Biology > Computational Neuroscience > Coding Mechanisms
Biology and Life Sciences > Neuroscience > Computational Neuroscience > Coding Mechanisms
Biology and Life Sciences > Anatomy > Nervous System > Synapses
Medicine and Health Sciences > Anatomy > Nervous System > Synapses
Biology and Life Sciences > Physiology > Electrophysiology > Neurophysiology > Synapses
Medicine and Health Sciences > Physiology > Electrophysiology > Neurophysiology > Synapses
Biology and Life Sciences > Neuroscience > Neurophysiology > Synapses
Computer and Information Sciences > Systems Science > Dynamical Systems
Physical Sciences > Mathematics > Systems Science > Dynamical Systems
Engineering and Technology
                Custom metadata
                vor-update-to-uncorrected-proof
                2019-06-20
                The simulation code is available at https://github.com/emiwar/SpikingManifolds.

                Quantitative & Systems biology
