
      A review: Photonics devices, architectures, and algorithms for optical neural computing


          Abstract

          The explosive growth of data and information has motivated various emerging non-von Neumann computational approaches in the More-than-Moore era. Photonic neuromorphic computing has attracted considerable attention owing to advantages such as high speed, wide bandwidth, and massive parallelism. Here, we review the optical neural computing carried out in our research groups at the device and system levels. Photonic neurons and photonic synaptic plasticity are presented. In addition, we introduce several optical neural computing architectures and algorithms, including photonic spiking neural networks, photonic convolutional neural networks, photonic matrix computation, photonic reservoir computing, and photonic reinforcement learning. Finally, we summarize the major challenges faced by photonic neuromorphic computing and propose promising solutions and perspectives.
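The architectures listed in the abstract share matrix–vector multiplication as their core primitive. As a rough numerical sketch (not the authors' implementation), an incoherent photonic crossbar encodes a non-negative weight matrix in per-path transmissions, attenuates each input optical power accordingly, and sums powers at a photodetector per output row; all names and sizes below are illustrative assumptions:

```python
import numpy as np

# Incoherent photonic crossbar sketch: input power p[j] passes through a
# path with transmission T[i][j] in [0, 1]; the detector on row i sums
# the attenuated powers, realizing y = T @ p with non-negative weights.
rng = np.random.default_rng(1)
T = rng.uniform(0, 1, (3, 4))   # transmission matrix (weights)
p = rng.uniform(0, 1, 4)        # input optical powers
y = T @ p                       # per-row photodetector readout
print(np.allclose(y, [sum(T[i, j] * p[j] for j in range(4))
                      for i in range(3)]))  # True
```

Signed or complex weights require extra machinery (e.g. differential detection or coherent interference), which this sketch deliberately omits.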

          Related collections

          Most cited references (76)


          Real-time computing without stable states: a new framework for neural computation based on perturbations.

          A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
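The idea in this abstract — a fixed, high-dimensional recurrent circuit whose transient dynamics serve as fading memory, with only a linear readout trained — can be sketched with an echo-state-style reservoir (a rate-based cousin of the liquid state machine; reservoir size, scaling, and the delayed-recall task are illustrative assumptions, not the paper's setup):

```python
import numpy as np

# Minimal echo-state-style reservoir: random fixed recurrent weights,
# tanh units, and a linear readout trained by least squares to recall
# the input from two steps in the past (a fading-memory task).
rng = np.random.default_rng(0)
N = 100                          # reservoir size (assumed)
W_in = rng.normal(0, 0.5, N)     # fixed input weights
W = rng.normal(0, 1.0, (N, N))   # fixed recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u."""
    x = np.zeros(N)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in * u_t)
        states.append(x.copy())
    return np.array(states)

u = rng.uniform(-1, 1, 500)
X = run_reservoir(u)
y = np.roll(u, 2)                # target: input delayed by 2 steps
w_out, *_ = np.linalg.lstsq(X[10:], y[10:], rcond=None)  # train readout only
pred = X[10:] @ w_out
print(round(float(np.corrcoef(pred, y[10:])[0, 1]), 3))
```

Only `w_out` is trained; the recurrent circuit stays fixed, mirroring the readout-neuron picture in the abstract.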

            Synaptic Modifications in Cultured Hippocampal Neurons: Dependence on Spike Timing, Synaptic Strength, and Postsynaptic Cell Type

            In cultures of dissociated rat hippocampal neurons, persistent potentiation and depression of glutamatergic synapses were induced by correlated spiking of presynaptic and postsynaptic neurons. The relative timing between the presynaptic and postsynaptic spiking determined the direction and the extent of synaptic changes. Repetitive postsynaptic spiking within a time window of 20 msec after presynaptic activation resulted in long-term potentiation (LTP), whereas postsynaptic spiking within a window of 20 msec before the repetitive presynaptic activation led to long-term depression (LTD). Significant LTP occurred only at synapses with relatively low initial strength, whereas the extent of LTD did not show obvious dependence on the initial synaptic strength. Both LTP and LTD depended on the activation of NMDA receptors and were absent in cases in which the postsynaptic neurons were GABAergic in nature. Blockade of L-type calcium channels with nimodipine abolished the induction of LTD and reduced the extent of LTP. These results underscore the importance of precise spike timing, synaptic strength, and postsynaptic cell type in the activity-induced modification of central synapses and suggest that Hebb’s rule may need to incorporate a quantitative consideration of spike timing that reflects the narrow and asymmetric window for the induction of synaptic modification.
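The asymmetric ~20 ms timing window described above is the basis of spike-timing-dependent plasticity (STDP) rules used in photonic spiking neural networks. A minimal pair-based sketch follows; the exponential shape and the amplitudes are common modeling assumptions, not values from the paper:

```python
import math

# Pair-based STDP sketch: pre-before-post within the window potentiates
# (LTP), post-before-pre depresses (LTD). TAU_MS reflects the ~20 ms
# window in the abstract; A_PLUS and A_MINUS are assumed amplitudes.
TAU_MS = 20.0
A_PLUS = 0.05
A_MINUS = 0.025

def stdp_dw(t_post_ms: float, t_pre_ms: float) -> float:
    """Weight change for a single pre/post spike pair."""
    dt = t_post_ms - t_pre_ms
    if dt > 0:       # pre before post -> LTP
        return A_PLUS * math.exp(-dt / TAU_MS)
    if dt < 0:       # post before pre -> LTD
        return -A_MINUS * math.exp(dt / TAU_MS)
    return 0.0

print(stdp_dw(10.0, 0.0) > 0)   # True: pre leads post, potentiation
print(stdp_dw(0.0, 10.0) < 0)   # True: post leads pre, depression
```

The abstract's further findings (dependence on initial strength and postsynaptic cell type) are not captured by this simple rule.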

              Recent advances in physical reservoir computing: A review


Author and article information

Journal: Journal of Semiconductors (J. Semicond.), IOP Publishing
ISSN: 1674-4926 (print); 2058-6140 (online)
Published: February 01 2021
Volume 42, Issue 2, Article 023105
DOI: 10.1088/1674-4926/42/2/023105
© 2021

https://iopscience.iop.org/page/copyright
