
      Light-Stimulatable Molecules/Nanoparticles Networks for Switchable Logical Functions and Reservoir Computing


Most cited references (50)

          A study of the nucleation and growth processes in the synthesis of colloidal gold

            Real-time computing without stable states: a new framework for neural computation based on perturbations.

            A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.

              Harnessing nonlinearity: predicting chaotic systems and saving energy in wireless communication.

              We present a method for learning nonlinear systems, echo state networks (ESNs). ESNs employ artificial recurrent neural networks in a way that has recently been proposed independently as a learning mechanism in biological brains. The learning method is computationally efficient and easy to use. On a benchmark task of predicting a chaotic time series, accuracy is improved by a factor of 2400 over previous techniques. The potential for engineering applications is illustrated by equalizing a communication channel, where the signal error rate is improved by two orders of magnitude.
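The echo state network approach summarized above can be illustrated with a minimal sketch: a fixed random recurrent reservoir is driven by the input, and only a linear readout is trained by ridge regression. The reservoir size, spectral radius, ridge strength, and the sine-wave toy task below are illustrative assumptions, not values or tasks from the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumptions, not from the paper)
n_reservoir = 200
spectral_radius = 0.9  # rough heuristic for the echo state property

# Fixed random input and recurrent weights; never trained
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir,))
W = rng.normal(size=(n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the fixed reservoir with input sequence u; collect states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
t = np.arange(3000)
u = np.sin(0.05 * t)
target = np.roll(u, -1)

X = run_reservoir(u)
washout = 100  # discard the initial transient
X_tr, y_tr = X[washout:-1], target[washout:-1]

# Train only the linear readout (ridge regression, normal equations)
ridge = 1e-6
W_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_reservoir),
                        X_tr.T @ y_tr)

pred = X_tr @ W_out
rmse = np.sqrt(np.mean((pred - y_tr) ** 2))
print("train RMSE:", rmse)
```

Because the recurrent weights stay fixed, training reduces to a single linear solve over the collected reservoir states, which is what makes the method computationally cheap compared with training a full recurrent network.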

                Author and article information

Journal: Advanced Functional Materials (Adv. Funct. Mater.), Wiley
ISSN: 1616-301X
Published: September 2018 (online August 06 2018)
Volume 28, Issue 39, Article 1801506
Affiliations
[1] Institute of Electronics, Microelectronics and Nanotechnology (IEMN), CNRS, University of Lille, Av. Poincaré, 59652 Villeneuve d'Ascq cedex, France
[2] Department of Physics, University of Basel, Klingelbergstrasse 82, 4056 Basel, Switzerland
[3] Transport at Nanoscale Interfaces Laboratory, Empa, Swiss Federal Laboratories for Materials Science and Technology, Überlandstrasse 129, 8600 Dübendorf, Switzerland
                Article
DOI: 10.1002/adfm.201801506
                © 2018

License: http://doi.wiley.com/10.1002/tdm_license_1.1
Terms of use: http://onlinelibrary.wiley.com/termsAndConditions#vor
