Open Access

      Dataflow Matrix Machines and V-values: a Bridge between Programs and Neural Nets

      Preprint


          Abstract

Dataflow matrix machines generalize neural nets by replacing streams of numbers with streams of vectors (or other kinds of linear streams admitting a notion of linear combination of several streams) and adding a few more changes on top of that, namely arbitrary input and output arities for activation functions, countable-sized networks with finite dynamically changeable active part capable of unbounded growth, and a very expressive self-referential mechanism. While recurrent neural networks are Turing-complete, they form an esoteric programming platform, not conducive to practical general-purpose programming. Dataflow matrix machines are more suitable as a general-purpose programming platform, although it remains to be seen whether this platform can be made fully competitive with more traditional programming platforms currently in use. At the same time, dataflow matrix machines retain the key property of recurrent neural networks: programs are expressed via matrices of real numbers, and continuous changes to those matrices produce arbitrarily small variations in the programs associated with those matrices. Spaces of vector-like elements are of particular importance in this context. In particular, we focus on the vector space \(V\) of finite linear combinations of strings, which can also be understood as the vector space of finite prefix trees with numerical leaves, the vector space of "mixed rank tensors", or the vector space of recurrent maps. This space, and a family of spaces of vector-like elements derived from it, are sufficiently expressive to cover all cases of interest we are currently aware of, and allow a compact and streamlined version of dataflow matrix machines based on a single space of vector-like elements and variadic neurons. We call elements of these spaces V-values. Their role in our context is somewhat similar to the role of S-expressions in Lisp.
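The abstract's description of \(V\) as finite linear combinations of strings, or equivalently finite prefix trees with numerical leaves, can be sketched in code. The following is an illustrative sketch only (not the paper's implementation): a V-value is modeled as a nested dict whose keys are string tokens, with numbers stored under a reserved leaf key (the key name `":number"` is a hypothetical convention chosen here), and the vector-space operations merge or scale those trees.

```python
# Illustrative sketch (assumptions, not the paper's code): a V-value as a
# finite prefix tree with numerical leaves, represented by nested dicts.
LEAF = ":number"  # hypothetical reserved key marking a numerical leaf

def scale(v, c):
    """Scalar multiplication: multiply every numerical leaf of v by c."""
    return {k: (val * c if k == LEAF else scale(val, c))
            for k, val in v.items()}

def add(u, v):
    """Vector addition: merge two prefix trees, summing coinciding leaves."""
    out = dict(u)
    for k, val in v.items():
        if k in out:
            out[k] = out[k] + val if k == LEAF else add(out[k], val)
        else:
            out[k] = val
    return out

# The string "a b" with coefficient 2 plus the string "a c" with coefficient 3,
# i.e. the linear combination 2*"ab" + 3*"ac" as a shared prefix tree:
x = {"a": {"b": {LEAF: 2.0}}}
y = {"a": {"c": {LEAF: 3.0}}}
z = add(scale(x, 0.5), y)  # 0.5*"ab" + 3*"ac"
```

Because linear combinations of such trees are again trees of the same shape, streams of V-values admit the matrix-weighted linear combinations that dataflow matrix machines require.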


Most cited references (3)


          On the Computational Power of Neural Nets


            A ‘Self-Referential’ Weight Matrix


              The Digital Computer as a Musical Instrument


                Author and article information

Published: 20 December 2017
arXiv: 1712.07447
ID: b65b3ce0-7e23-45f8-bb09-081add78621e
License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/

Comments: 28 pages, 5 figures; appeared in "K + K = 120: Papers dedicated to László Kálmán and András Kornai on the occasion of their 60th birthdays" Festschrift
Subjects: cs.NE cs.PL
