
      Unsupervised Learning of Temporal Abstractions with Slot-based Transformers

      Preprint


          Abstract

          The discovery of reusable sub-routines simplifies decision-making and planning in complex reinforcement learning problems. Previous approaches propose to learn such temporal abstractions in a purely unsupervised fashion by observing state-action trajectories gathered from executing a policy. However, a current limitation is that they process each trajectory in an entirely sequential manner, which prevents them from revising earlier decisions about sub-routine boundary points in light of new incoming information. In this work, we propose SloTTAr, a fully parallel approach that integrates sequence-processing Transformers with a Slot Attention module and adaptive computation for learning the number of such sub-routines in an unsupervised fashion. We demonstrate that SloTTAr outperforms strong baselines at boundary point discovery, even for sequences containing variable numbers of sub-routines, while being up to 7x faster to train on existing benchmarks.
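The Slot Attention module mentioned in the abstract iteratively lets a fixed set of slot vectors compete for input elements: attention is normalized over the slots (so inputs are softly assigned to slots), and each slot is then updated from its weighted share of the inputs. The sketch below is a heavily simplified, untrained illustration of that mechanism, not the architecture from the paper: the learned GRU/MLP slot update of the original Slot Attention is replaced by a plain weighted mean, and the query/key/value projections are identity maps.

```python
import numpy as np

def softmax(x, axis):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(inputs, num_slots=3, iters=3, seed=0):
    """Simplified Slot Attention sketch (hypothetical, untrained).

    inputs: (n, d) array of token features.
    Returns a (num_slots, d) array of slot representations.
    """
    n, d = inputs.shape
    rng = np.random.default_rng(seed)
    slots = rng.normal(size=(num_slots, d))  # random slot initialization
    for _ in range(iters):
        # Attention logits between every slot (query) and input (key).
        logits = slots @ inputs.T / np.sqrt(d)      # (num_slots, n)
        # Softmax over the *slot* axis: inputs compete for slots.
        attn = softmax(logits, axis=0)
        # Normalize per slot, then take a weighted mean of the inputs.
        attn = attn / (attn.sum(axis=1, keepdims=True) + 1e-8)
        slots = attn @ inputs                        # (num_slots, d)
    return slots
```

In SloTTAr the slots summarize candidate sub-routine segments of a trajectory; the softmax-over-slots normalization is what encourages each input step to be explained by (roughly) one slot.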


          Author and article information

          Journal
          Date: 25 March 2022
          Type: Article
          arXiv ID: 2203.13573
          Record ID: ca314eb7-1f73-4cb0-8243-409a373efcde
          License: http://creativecommons.org/licenses/by/4.0/

          History
          Custom metadata: 26 pages, 8 figures
          Subject classes: cs.LG, cs.AI, cs.NE
          Keywords: Neural & Evolutionary Computing, Artificial Intelligence
