      Is Open Access

      Template-Based Posit Multiplication for Training and Inferring in Neural Networks

      Preprint


          Abstract

          The posit number system is arguably the most promising and widely discussed topic in computer arithmetic today. The recent breakthroughs claimed for the format proposed by John L. Gustafson have put posits in the spotlight. In this work, we first describe an algorithm for multiplying two posit numbers, even when the number of exponent bits is zero. This configuration, scarcely tackled in the literature, is particularly interesting because it enables a fast sigmoid function. The proposed multiplication algorithm is then integrated as a template into the well-known FloPoCo framework, and synthesis results are compared with those of the floating-point multiplier that FloPoCo generates. Second, the performance of posits is studied for neural networks in both the training and inference stages. To the best of our knowledge, this is the first time training has been done in the posit format, achieving promising results on a binary classification problem even with reduced posit configurations. In the inference stage, 8-bit posits match floating point on the MNIST dataset, but lose some accuracy on CIFAR-10.
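          The es = 0 configuration highlighted in the abstract admits a particularly cheap sigmoid: flipping the sign bit of a posit(n, 0) bit pattern and shifting it right by two approximates sigmoid(x). The following Python is a behavioral sketch of posit⟨8,0⟩ decoding and that sigmoid trick, not the FloPoCo hardware template described in the paper; the function names are illustrative, not from the preprint.

```python
def posit8_decode(bits):
    """Decode an 8-bit posit with es = 0 into a Python float.

    Value = sign * 2^k * (1 + fraction), where k comes from the regime
    run length (useed = 2^(2^es) = 2 when es = 0, so there is no
    separate exponent field).
    """
    if bits == 0x00:
        return 0.0
    if bits == 0x80:
        return float("nan")          # NaR (Not a Real)
    sign = 1.0
    if bits & 0x80:                  # negative posits are two's complement
        sign = -1.0
        bits = (-bits) & 0xFF
    first = (bits >> 6) & 1          # leading regime bit
    run, pos = 0, 6
    while pos >= 0 and ((bits >> pos) & 1) == first:
        run, pos = run + 1, pos - 1
    k = run - 1 if first else -run   # regime run length -> scale k
    pos -= 1                         # skip the regime terminator bit
    nfrac = max(pos + 1, 0)          # remaining bits are the fraction
    frac = bits & ((1 << nfrac) - 1)
    return sign * 2.0 ** k * (1.0 + frac / (1 << nfrac))

def posit8_mul(a, b):
    """Behavioral product of two posit<8,0> values (exact, unrounded)."""
    return posit8_decode(a) * posit8_decode(b)

def fast_sigmoid(bits):
    """Fast sigmoid for posit(n, 0): flip the sign bit, shift right by 2."""
    return (bits ^ 0x80) >> 2
```

          For example, 0x40 decodes to 1.0, and fast_sigmoid(0x40) = 0x30 decodes to 0.75, close to sigmoid(1) ≈ 0.731, while fast_sigmoid(0x00) decodes to exactly 0.5 = sigmoid(0). The hardware version described in the paper works directly on the integer fields rather than through a float round trip.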


          Most cited references (11)


          DaDianNao: A Machine-Learning Supercomputer


            Quantization and Training of Neural Networks for Efficient Integer-Arithmetic-Only Inference


              Designing Custom Arithmetic Data Paths with FloPoCo


                Author and article information

                Journal
                09 July 2019
                Article
                arXiv: 1907.04091
                49f73b03-d05e-471c-b1c4-470d5a914949

                http://arxiv.org/licenses/nonexclusive-distrib/1.0/

                History
                Custom metadata
                12 pages
                cs.CV cs.LG

                Computer vision & Pattern recognition, Artificial intelligence
