      Task-adaptive physical reservoir computing


          Abstract

          Reservoir computing is a neuromorphic architecture that may offer viable solutions to the growing energy costs of machine learning. In software-based machine learning, computing performance can be readily reconfigured to suit different computational tasks by tuning hyperparameters. This critical functionality is missing in ‘physical’ reservoir computing schemes that exploit nonlinear and history-dependent responses of physical systems for data processing. Here we overcome this issue with a ‘task-adaptive’ approach to physical reservoir computing. By leveraging a thermodynamical phase space to reconfigure key reservoir properties, we optimize computational performance across a diverse task set. We use the spin-wave spectra of the chiral magnet Cu2OSeO3, which hosts skyrmion, conical and helical magnetic phases, providing on-demand access to different computational reservoir responses. The task-adaptive approach is applicable to a wide variety of physical systems, which we show in other chiral magnets via above (and near) room-temperature demonstrations in Co8.5Zn8.5Mn3 (and FeGe).
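          The reconfigurability the abstract describes can be illustrated with a minimal software reservoir (an echo state network), in which hyperparameters such as the spectral radius play the role that magnetic-phase selection plays in the physical system. The sketch below is a generic, hypothetical illustration, not the authors' setup; the network size, scalings and the one-step sine-prediction task are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir; only the linear readout is trained.
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius to 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Example task: one-step-ahead prediction of a sine wave.
t = np.linspace(0.0, 20.0 * np.pi, 2000)
u = np.sin(t)
X = run_reservoir(u[:-1])          # reservoir states, one per input step
y = u[1:]                          # target: the next input value
washout = 100                      # discard initial transient states
W_out = np.linalg.lstsq(X[washout:], y[washout:], rcond=None)[0]
pred = X[washout:] @ W_out
nmse = np.mean((pred - y[washout:]) ** 2) / np.var(y[washout:])
```

Re-tuning the spectral radius or input scaling reconfigures the reservoir's dynamics for a different task; in this framing, that is the software analogue of moving between the skyrmion, conical and helical phases.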

          Most cited references (49)


          SciPy 1.0: fundamental algorithms for scientific computing in Python

          SciPy is an open-source scientific computing library for the Python programming language. Since its initial release in 2001, SciPy has become a de facto standard for leveraging scientific algorithms in Python, with over 600 unique code contributors, thousands of dependent packages, over 100,000 dependent repositories and millions of downloads per year. In this work, we provide an overview of the capabilities and development practices of SciPy 1.0 and highlight some recent technical developments.

            Smoothing and Differentiation of Data by Simplified Least Squares Procedures.
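            This reference describes the Savitzky–Golay filter, which fits a low-order polynomial to each sliding window of the data by least squares; SciPy (cited above) provides it as `scipy.signal.savgol_filter`. A brief sketch of its defining property, namely that data which are exactly a polynomial of degree at most the fit order are reproduced unchanged, with the same machinery yielding smoothed derivatives:

```python
import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(0.0, 1.0, 50)
y = 3.0 * x**2 - 2.0 * x + 1.0  # quadratic signal, no noise

# polyorder=2 matches the signal's degree, so the filter passes it through.
smoothed = savgol_filter(y, window_length=7, polyorder=2)

# deriv=1 with the sample spacing in `delta` gives the smoothed derivative.
dy = savgol_filter(y, window_length=7, polyorder=2, deriv=1, delta=x[1] - x[0])
```

On noisy data the same call attenuates high-frequency noise while preserving features (such as peak heights) up to the chosen polynomial order.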


              Real-time computing without stable states: a new framework for neural computation based on perturbations.

              A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.

                Author and article information

                Journal: Nature Materials (Nat. Mater.)
                Publisher: Springer Science and Business Media LLC
                ISSN: 1476-1122 (print); 1476-4660 (online)
                Publication date: November 13 2023
                Type: Article
                DOI: 10.1038/s41563-023-01698-8
                PMID: 38012388
                Record ID: 46f1a559-ffcb-4c35-9e83-b8e0a6152172
                © 2023

                https://creativecommons.org/licenses/by/4.0

