Reservoir Computing Trends

Most cited references (39)

Long Short-Term Memory

Learning long-term dependencies with gradient descent is difficult.

Recurrent neural networks can be used to map input sequences to output sequences, such as for recognition, production, or prediction problems. However, practical difficulties have been reported in training recurrent neural networks to perform tasks in which the temporal contingencies present in the input/output sequences span long intervals. We show why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases. These results expose a trade-off between efficient learning by gradient descent and latching onto information for long periods. Based on an understanding of this problem, alternatives to standard gradient descent are considered.
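As a rough numerical illustration of this trade-off (a sketch of my own, not code from the paper): in a plain tanh recurrent network, the sensitivity of the state after T steps to the initial state is a product of T per-step Jacobians, and its norm typically shrinks geometrically with T, so gradient signals from distant time steps all but vanish.

import numpy as np

# Hedged sketch: a generic tanh RNN, h_t = tanh(W h_{t-1} + x_t).
# The sensitivity d h_T / d h_0 is a product of per-step Jacobians
# diag(1 - h_t^2) @ W, so its norm decays (or blows up) geometrically,
# which is the difficulty the abstract describes.
rng = np.random.default_rng(0)
n = 32
W = rng.normal(scale=0.9 / np.sqrt(n), size=(n, n))  # mildly contractive weights

h = np.zeros(n)
jac = np.eye(n)  # accumulates d h_T / d h_0
for t in range(1, 101):
    h = np.tanh(W @ h + rng.normal(size=n))          # one recurrent step
    jac = (np.diag(1.0 - h**2) @ W) @ jac            # chain rule for this step
    if t % 20 == 0:
        print(f"T={t:3d}  ||d h_T / d h_0||_2 = {np.linalg.norm(jac, 2):.3e}")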

Real-time computing without stable states: a new framework for neural computation based on perturbations.

A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such a recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
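The readout principle described here is what the echo state network branch of reservoir computing implements with rate neurons. A minimal sketch, assuming a toy delayed-recall task rather than the paper's spiking liquid state machine: a fixed random reservoir runs over an input stream, and only a linear readout is fitted (here by ridge regression) to recover the input from five steps earlier out of the current transient state.

import numpy as np

# Hedged sketch of a reservoir with a trained linear readout (an echo
# state network analogue of the liquid state machine; the task and all
# parameter choices are illustrative assumptions, not the paper's model).
rng = np.random.default_rng(1)
n, T, delay = 200, 2000, 5

W = rng.normal(size=(n, n))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius 0.9: fading memory
w_in = rng.uniform(-0.5, 0.5, size=n)

u = rng.uniform(-1.0, 1.0, size=T)             # random input stream
X = np.zeros((T, n))                           # reservoir state trajectory
h = np.zeros(n)
for t in range(T):
    h = np.tanh(W @ h + w_in * u[t])           # fixed, untrained dynamics
    X[t] = h

y = np.roll(u, delay)                          # target: input delayed by 5 steps
X_tr, y_tr = X[100:], y[100:]                  # discard the initial transient

# Only the readout is learned: ridge regression onto the reservoir states.
lam = 1e-6
w_out = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n), X_tr.T @ y_tr)
print("readout MSE:", np.mean((X_tr @ w_out - y_tr) ** 2))

Because the recurrent weights stay fixed, training reduces to a single linear regression; the reservoir's transient, high-dimensional state supplies the fading memory that the readout taps.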

Author and article information

Journal: KI - Künstliche Intelligenz (Künstl Intell)
Publisher: Springer Nature
ISSN: 0933-1875 (print); 1610-1987 (electronic)
Published: May 2012 (online); November 2012 (issue)
Volume: 26
Issue: 4
Pages: 365-371
DOI: 10.1007/s13218-012-0204-5
Copyright: © 2012
