
      Supervised Learning in All FeFET-Based Spiking Neural Network: Opportunities and Challenges

      Research Article


          Abstract

          The two possible pathways toward artificial intelligence (AI), (i) neuroscience-oriented neuromorphic computing [such as spiking neural networks (SNNs)] and (ii) computer science-driven machine learning (such as deep learning), differ widely in their fundamental formalism and coding schemes (Pei et al., 2019). Deviating from the traditional deep learning approach of relying on neuronal models with static nonlinearities, SNNs attempt to capture brain-like features such as computation using spikes, which holds the promise of improving the energy efficiency of computing platforms. To achieve a much higher areal and energy efficiency than today’s hardware implementations of SNNs, we need to go beyond the traditional route of CMOS-based digital or mixed-signal neuronal circuits and the segregation of computation and memory under the von Neumann architecture. Recently, ferroelectric field-effect transistors (FeFETs) have been explored as a promising alternative for building neuromorphic hardware, utilizing their non-volatile nature and rich polarization switching dynamics. In this work, we propose an all FeFET-based SNN hardware that allows low-power spike-based information processing and co-localized memory and computing (a.k.a. in-memory computing). We experimentally demonstrate the essential neuronal and synaptic dynamics in a 28 nm high-K metal gate FeFET technology. Furthermore, drawing inspiration from the traditional machine learning approach of optimizing a cost function to adjust the synaptic weights, we implement a surrogate gradient (SG) learning algorithm on our SNN platform that allows us to perform supervised learning on the MNIST dataset. As such, we provide a pathway toward building energy-efficient neuromorphic hardware that can support traditional machine learning algorithms. Finally, we undertake synergistic device-algorithm co-design by accounting for the impact of device-level variation (stochasticity) and the limited bit precision of on-chip synaptic weights (available analog states) on the classification accuracy.
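
          The surrogate gradient approach mentioned in the abstract replaces the non-differentiable spike nonlinearity with a smooth function during the backward pass, so that a cost function over the network output can be optimized with ordinary gradient descent. The sketch below is a minimal PyTorch illustration of that general technique, plus a toy helper for the device-variation and bit-precision study; the neuron dynamics, surrogate shape, bit width, and noise level are assumptions for illustration, not the authors' implementation or measured FeFET parameters.

          ```python
          # Hypothetical sketch (not the authors' code): surrogate gradient (SG)
          # training primitive for a spiking layer, following the generic SG recipe.
          import torch

          class SurrogateSpike(torch.autograd.Function):
              """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""

              scale = 10.0  # steepness of the surrogate; an assumed hyperparameter

              @staticmethod
              def forward(ctx, membrane_minus_threshold):
                  ctx.save_for_backward(membrane_minus_threshold)
                  return (membrane_minus_threshold > 0).float()

              @staticmethod
              def backward(ctx, grad_output):
                  (x,) = ctx.saved_tensors
                  # derivative of a fast sigmoid: 1 / (scale*|x| + 1)^2
                  surrogate = 1.0 / (SurrogateSpike.scale * x.abs() + 1.0) ** 2
                  return grad_output * surrogate

          def lif_step(v, syn_input, beta=0.9, v_th=1.0):
              """One leaky integrate-and-fire update with soft reset (assumed dynamics)."""
              v = beta * v + syn_input                 # leaky integration of synaptic input
              spikes = SurrogateSpike.apply(v - v_th)  # hard threshold forward, SG backward
              v = v - spikes * v_th                    # soft reset after a spike
              return v, spikes

          def quantize_with_variation(w, n_bits=4, sigma=0.02):
              """Map weights onto a limited number of analog states and add
              device-to-device variation; n_bits and sigma are illustrative values."""
              levels = 2 ** n_bits - 1
              w_max = w.abs().max().clamp(min=1e-8)
              w_q = torch.round(w / w_max * levels) / levels * w_max
              return w_q + sigma * w_max * torch.randn_like(w_q)
          ```

          With such a spike function, a stack of lif_step layers can be trained on MNIST with a standard optimizer, and quantize_with_variation can then be applied to the trained weights to gauge how limited analog states and stochasticity degrade classification accuracy, in the spirit of the co-design study described above.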


          Most cited references (65)


          Simple model of spiking neurons.

          A model is presented that reproduces spiking and bursting behavior of known types of cortical neurons. The model combines the biological plausibility of Hodgkin-Huxley-type dynamics and the computational efficiency of integrate-and-fire neurons. Using this model, one can simulate tens of thousands of spiking cortical neurons in real time (1 ms resolution) using a desktop PC.
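
          For context, the two-variable model this abstract describes (Izhikevich, 2003) couples a fast membrane variable with a slow recovery variable and applies a reset rule on each spike. The sketch below is a minimal Python/NumPy simulation using the commonly quoted regular-spiking parameter values; the parameters a, b, c, d and the 30 mV cutoff are the standard defaults, not values taken from this record.

          ```python
          # Sketch of the two-variable spiking model described above, with
          # regular-spiking parameters and Euler integration at 1 ms resolution.
          import numpy as np

          def simulate_izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, T=1000, dt=1.0):
              """Returns the membrane-potential trace and the list of spike times (ms)."""
              v, u = c, b * c
              v_trace, spikes = [], []
              for t in range(int(T / dt)):
                  v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)  # fast membrane variable
                  u += dt * a * (b * v - u)                           # slow recovery variable
                  if v >= 30.0:              # spike cutoff
                      spikes.append(t * dt)
                      v, u = c, u + d        # reset after a spike
                  v_trace.append(v)
              return np.array(v_trace), spikes
          ```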

            Noise in the nervous system.

            Noise (random disturbances of signals) poses a fundamental problem for information processing and affects all aspects of nervous-system function. However, the nature, amount, and impact of noise in the nervous system have only recently been addressed in a quantitative manner. Experimental and computational methods have shown that multiple noise sources contribute to cellular and behavioural trial-to-trial variability. We review the sources of noise in the nervous system, from the molecular to the behavioural level, and show how noise contributes to trial-to-trial variability. We highlight how noise affects neuronal networks and the principles the nervous system applies to counter the detrimental effects of noise, and briefly discuss the potential benefits of noise.

              Nanoelectronic programmable synapses based on phase change materials for brain-inspired computing.

              Brain-inspired computing is an emerging field, which aims to extend the capabilities of information technology beyond digital logic. A compact nanoscale device, emulating biological synapses, is needed as the building block for brain-like computational systems. Here, we report a new nanoscale electronic synapse based on technologically mature phase change materials employed in optical data storage and nonvolatile memory applications. We utilize continuous resistance transitions in phase change materials to mimic the analog nature of biological synapses, enabling the implementation of a synaptic learning rule. We demonstrate different forms of spike-timing-dependent plasticity using the same nanoscale synapse with picojoule level energy consumption.
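
              The synaptic learning rule mentioned here, spike-timing-dependent plasticity (STDP), adjusts a weight according to the relative timing of pre- and post-synaptic spikes. The sketch below shows a generic pair-based STDP window with exponential timing dependence; the amplitudes and time constants are illustrative placeholders, not values measured on the phase change devices.

              ```python
              # Illustrative pair-based STDP weight update (not device-measured values).
              import numpy as np

              def stdp_delta_w(delta_t, a_plus=0.01, a_minus=0.012,
                               tau_plus=20.0, tau_minus=20.0):
                  """Weight change for a pre/post spike pair separated by
                  delta_t = t_post - t_pre (milliseconds)."""
                  delta_t = np.asarray(delta_t, dtype=float)
                  potentiation = a_plus * np.exp(-delta_t / tau_plus)   # pre fires before post
                  depression = -a_minus * np.exp(delta_t / tau_minus)   # post fires before pre
                  return np.where(delta_t >= 0, potentiation, depression)
              ```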

                Author and article information

                Journal
                Frontiers in Neuroscience (Front. Neurosci.)
                Publisher: Frontiers Media S.A.
                ISSN: 1662-4548 (print); 1662-453X (electronic)
                Published: 24 June 2020
                Volume: 14, Article: 634
                Affiliations
                1. Department of Electrical Engineering, College of Engineering, University of Notre Dame, Notre Dame, IN, United States
                2. Department of Computer Science and Engineering, College of Engineering, University of Notre Dame, Notre Dame, IN, United States
                3. Department of Microsystems Engineering, Rochester Institute of Technology, Rochester, NY, United States
                Author notes

                Edited by: Kaushik Roy, Purdue University, United States

                Reviewed by: Guoqi Li, Tsinghua University, China; Lyes Khacef, Université Côte d’Azur, France

                *Correspondence: Sourav Dutta, sdutta4@nd.edu

                This article was submitted to Neuromorphic Engineering, a section of the journal Frontiers in Neuroscience

                Article
                DOI: 10.3389/fnins.2020.00634
                PMCID: PMC7327100
                Copyright © 2020 Dutta, Schafer, Gomez, Ni, Joshi and Datta.

                This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

                History
                Received: 16 January 2020
                Accepted: 22 May 2020
                Page count
                Figures: 5, Tables: 2, Equations: 10, References: 76, Pages: 14
                Categories
                Neuroscience
                Original Research

                Subject: Neurosciences
                Keywords: neuromorphic computing, supervised learning, surrogate gradient learning, ferroelectric FET, spiking neural network, spiking neuron, analog synapse
