
      Beyond Reward Prediction Errors: Human Striatum Updates Rule Values During Learning

      Preprint
      bioRxiv


          Abstract

          Humans naturally group the world into coherent categories defined by membership rules. Rules can be learned implicitly by building stimulus-response associations using reinforcement learning (RL), or by using explicit reasoning. We tested whether the striatum, in which activation reliably scales with reward prediction error, would track prediction errors in a task that required explicit rule generation. Using functional magnetic resonance imaging during a categorization task, we show that striatal responses to feedback scale with a surprise signal derived from a Bayesian rule-learning model. We also find that striatal feedback responses are inconsistent with RL prediction error, and demonstrate that the striatum and caudal inferior frontal sulcus (cIFS) are involved in updating the likelihood of discriminative rules. We conclude that the striatum, in cooperation with the cIFS, is involved in updating the values assigned to categorization rules, rather than representing reward prediction errors.
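          To make the contrast between the two feedback signals concrete, the following is a minimal sketch (not the authors' model; priors, likelihoods, and variable names are illustrative assumptions) of how a standard RL reward prediction error differs from the Shannon surprise produced by a Bayesian update over candidate rules:

          ```python
          import math

          def rl_prediction_error(reward, value):
              """Standard RL prediction error: delta = r - V."""
              return reward - value

          def bayesian_update(prior, likelihoods):
              """Update beliefs over candidate rules given the likelihood of the
              observed feedback under each rule.

              Returns the posterior over rules and the Shannon surprise of the
              outcome, -log P(outcome), under the prior-predictive distribution.
              """
              # Predictive probability of the observed feedback
              p_outcome = sum(p * l for p, l in zip(prior, likelihoods))
              posterior = [p * l / p_outcome for p, l in zip(prior, likelihoods)]
              surprise = -math.log(p_outcome)
              return posterior, surprise

          # Illustrative trial: two candidate rules held with equal prior belief;
          # the observed "correct" feedback is likely under rule A, unlikely under B.
          prior = [0.5, 0.5]
          likelihoods = [0.9, 0.2]
          posterior, surprise = bayesian_update(prior, likelihoods)

          # An RL learner with value estimate V = 0.55 sees the same feedback.
          delta = rl_prediction_error(reward=1.0, value=0.55)
          ```

          The two signals can dissociate: the RL error depends only on a scalar value estimate, while the Bayesian surprise depends on how well the feedback was predicted across the whole set of candidate rules, which is the kind of dissociation the study exploits.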


          Author and article information

          Journal
          bioRxiv
          March 9, 2017
          Article
          10.1101/115253
          © 2017

          Molecular medicine, Neurosciences
