      DyNCA: Real-time Dynamic Texture Synthesis Using Neural Cellular Automata

      Preprint


          Abstract

          Current Dynamic Texture Synthesis (DyTS) models in the literature can synthesize realistic videos. However, these methods require a slow iterative optimization process to synthesize a single fixed-size short video, and they do not offer any post-training control over the synthesis process. We propose Dynamic Neural Cellular Automata (DyNCA), a framework for real-time and controllable dynamic texture synthesis. Our method is built upon the recently introduced NCA models, and can synthesize infinitely-long and arbitrary-size realistic texture videos in real-time. We quantitatively and qualitatively evaluate our model and show that our synthesized videos appear more realistic than the existing results. We improve the SOTA DyTS performance by 2 to 4 orders of magnitude. Moreover, our model offers several real-time and interactive video controls including motion speed, motion direction, and an editing brush tool.
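          To make the NCA idea concrete: in an NCA, every cell in a grid repeatedly updates its state by perceiving its local neighborhood (typically via fixed convolution filters) and passing the result through a small shared network. The sketch below is a minimal, hedged illustration of one generic NCA update step in NumPy, not the DyNCA model itself; the filter choice (identity plus Sobel gradients), the two-layer MLP, and the stochastic fire rate follow the common NCA recipe, and all names here are illustrative.

```python
import numpy as np

def nca_step(state, w1, b1, w2, rng, fire_rate=0.5):
    """One generic Neural Cellular Automata update (illustrative sketch).

    state: (H, W, C) grid of per-cell state vectors.
    Each cell perceives its 3x3 neighborhood with Sobel filters, feeds the
    perception vector through a tiny shared 2-layer MLP, and adds the
    resulting residual update to a random subset of cells.
    """
    H, W, C = state.shape
    # Fixed perception filters: Sobel-x and Sobel-y, applied channel-wise.
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]) / 8.0
    sobel_y = sobel_x.T
    # Wrap-around padding gives the torus topology NCAs often use.
    padded = np.pad(state, ((1, 1), (1, 1), (0, 0)), mode="wrap")
    # Gather each cell's 3x3 neighborhood: (H, W, 3, 3, C).
    windows = np.stack(
        [padded[i:i + H, j:j + W] for i in range(3) for j in range(3)],
        axis=2,
    ).reshape(H, W, 3, 3, C)
    gx = np.einsum("hwijc,ij->hwc", windows, sobel_x)
    gy = np.einsum("hwijc,ij->hwc", windows, sobel_y)
    # Perception vector per cell: own state plus both gradients -> 3C dims.
    perception = np.concatenate([state, gx, gy], axis=-1)
    # Tiny per-cell MLP (weights shared across all cells) -> residual update.
    hidden = np.maximum(perception @ w1 + b1, 0.0)  # ReLU
    delta = hidden @ w2                              # (H, W, C)
    # Stochastic update: only a random subset of cells fires each step.
    mask = rng.random((H, W, 1)) < fire_rate
    return state + delta * mask
```

Iterating such a step many times, with the MLP weights trained against a texture/motion loss, is what lets NCA-style models grow a texture over an arbitrary-size grid and keep evolving it indefinitely, which is the property DyNCA builds on.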


          Author and article information

          Journal: arXiv preprint
          Published: 21 November 2022
          arXiv ID: 2211.11417
          License: http://creativecommons.org/licenses/by-sa/4.0/
          Subject classes: cs.CV, cs.GR, cs.LG

          Keywords: Computer vision & Pattern recognition, Artificial intelligence, Graphics & Multimedia design
