      Is Open Access

      A 128 channel Extreme Learning Machine based Neural Decoder for Brain Machine Interfaces

      Preprint


          Abstract

          State-of-the-art motor-intention decoding algorithms for brain-machine interfaces are currently implemented mostly on a PC and consume a significant amount of power. This paper presents a machine learning co-processor in 0.35 um CMOS for motor-intention decoding in brain-machine interfaces. Using the Extreme Learning Machine (ELM) algorithm and low-power analog processing, it achieves an energy efficiency of 290 GMACs/W at a classification rate of 50 Hz. Second-stage learning and the corresponding digitally stored coefficients increase the robustness of the core analog processor. The chip is verified with neural data recorded in a monkey finger-movement experiment, achieving a decoding accuracy of 99.3% for movement type. The same co-processor is also used to decode the time of movement from asynchronous neural spikes. With time-delayed feature dimension enhancement, the classification accuracy can be increased by 5% with a limited number of input channels. Further, a sparsity-promoting training scheme reduces the number of programmable weights by ~2X.
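          The two-stage structure the abstract describes (a fixed random first layer realized in analog, followed by trained, digitally stored output weights) can be sketched in software. The code below is a minimal NumPy illustration of an ELM classifier plus a time-delayed feature-stacking helper; it is not the paper's implementation, and the hidden-layer size, tanh activation, and all function names are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, Y, n_hidden=60):
    """X: (n_samples, n_inputs) features; Y: (n_samples, n_classes) one-hot labels."""
    n_inputs = X.shape[1]
    # First stage: fixed random projection (on the chip this is the analog
    # core and is never trained).
    W_in = rng.standard_normal((n_inputs, n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W_in + b)  # hidden-layer activations
    # Second stage: output weights solved in closed form by least squares,
    # playing the role of the digitally stored coefficients.
    W_out, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W_in, b, W_out

def elm_predict(X, W_in, b, W_out):
    H = np.tanh(X @ W_in + b)
    return np.argmax(H @ W_out, axis=1)

def time_delay_features(X, n_delays=2):
    """Stack delayed copies of each channel to enlarge the feature
    dimension, in the spirit of the paper's time-delayed feature
    dimension enhancement. Drops the first n_delays rows to avoid
    wrap-around from np.roll."""
    parts = [np.roll(X, d, axis=0) for d in range(n_delays + 1)]
    return np.concatenate(parts, axis=1)[n_delays:]
```

          Because only the second-stage weights are trained, training reduces to a single linear solve, which is what makes the random first stage cheap enough to implement with imprecise analog circuits.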


          Author and article information

          Article type: Preprint
          Dates: 2015-09-22, 2015-09-27
          DOI: 10.1109/TBCAS.2015.2483618
          arXiv ID: 1509.07450
          License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
          Custom metadata: 13 pages, 17 figures, accepted by IEEE Transactions on Biomedical Circuits and Systems, 2015
          arXiv categories: cs.LG, cs.HC
          Keywords: Artificial intelligence, Human-computer interaction
