
      Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations

Wolfgang Maass [1], Thomas Natschläger [1], Henry Markram [2]
      Neural Computation
      MIT Press - Journals


          Abstract

A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such a recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
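The abstract describes a separation of concerns: a fixed, task-independent recurrent circuit acts as a high-dimensional fading memory, and simple readouts are trained to map its transient states to stable target outputs. The sketch below illustrates that idea with a rate-based random reservoir and a linear readout trained by ridge regression. It is a simplified analogue for illustration only, not the paper's spiking integrate-and-fire implementation, and all sizes and constants (N, leak, delay, ridge) are assumed values chosen for the example.

# Minimal sketch (assumed parameters, rate-based analogue of a liquid state machine):
# a fixed random recurrent circuit is driven by a time-varying input stream, and a
# linear readout is trained on the circuit's transient states to reproduce a
# delayed copy of the input, i.e. a fading-memory task.
import numpy as np

rng = np.random.default_rng(0)

N = 200      # number of reservoir units (assumed)
leak = 0.3   # leak rate of the rate-based units (assumed)
T = 2000     # length of the input stream

# Fixed, task-independent recurrent weights, rescaled so the dynamics are
# contracting ("fading memory" rather than attractor behavior).
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-1.0, 1.0, size=N)

# Time-varying input and a target that depends on past inputs.
u = rng.uniform(-0.5, 0.5, size=T)
delay = 5
y_target = np.roll(u, delay)

# Run the reservoir and collect its transient states.
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T):
    x = (1 - leak) * x + leak * np.tanh(W @ x + W_in * u[t])
    states[t] = x

# Train the linear readout by ridge regression on the collected states,
# discarding an initial warm-up period.
warmup = 100
X, Y = states[warmup:], y_target[warmup:]
ridge = 1e-4
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)

y_pred = states @ W_out
err = np.sqrt(np.mean((y_pred[warmup:] - Y) ** 2))
print(f"readout RMSE on delayed-input task: {err:.4f}")

The point of the sketch is that only the readout weights W_out are task-specific; the recurrent weights W stay fixed, so several readouts could be trained on the same state trajectory for different targets.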


Most cited references (17)


          Organizing principles for a diversity of GABAergic interneurons and synapses in the neocortex.

          A puzzling feature of the neocortex is the rich array of inhibitory interneurons. Multiple neuron recordings revealed numerous electrophysiological-anatomical subclasses of neocortical gamma-aminobutyric acid-ergic (GABAergic) interneurons and three types of GABAergic synapses. The type of synapse used by each interneuron to influence its neighbors follows three functional organizing principles. These principles suggest that inhibitory synapses could shape the impact of different interneurons according to their specific spatiotemporal patterns of activity and that GABAergic interneuron and synapse diversity may enable combinatorial inhibitory effects in the neocortex.

            Differential signaling via the same axon of neocortical pyramidal neurons


              Fading memory and the problem of approximating nonlinear operators with Volterra series


                Author and article information

Journal: Neural Computation
Publisher: MIT Press - Journals
ISSN (print): 0899-7667
ISSN (electronic): 1530-888X
Publication date: November 01, 2002
Volume: 14
Issue: 11
Pages: 2531-2560
Affiliations
[1] Institute for Theoretical Computer Science, Technische Universität Graz, A-8010 Graz, Austria
[2] Brain Mind Institute, École Polytechnique Fédérale de Lausanne, CH-1015 Lausanne, Switzerland
Article
DOI: 10.1162/089976602760407955
PMID: 12433288
ScienceOpen record ID: 751d7b57-6ed9-4ccf-ae1c-2412cfcaab76
© 2002
