Open Access
      Decoding material-specific memory reprocessing during sleep in humans

      research-article


          Abstract

Neuronal learning activity is reactivated during sleep, but the dynamics of this reactivation in humans are still poorly understood. Here we use multivariate pattern classification to decode electrical brain activity during sleep and determine what type of images participants had viewed in a preceding learning session. We find significant patterns of learning-related processing during rapid eye movement (REM) and non-REM (NREM) sleep, which are generalizable across subjects. This processing occurs in a cyclic fashion during time windows congruent with critical periods of synaptic plasticity. Its spatial distribution over the scalp and its relevant frequencies differ between NREM and REM sleep. Moreover, only the strength of reprocessing during slow-wave sleep influenced later memory performance, arguing for at least two distinct underlying mechanisms in these states. We thus show that memory reprocessing occurs in both NREM and REM sleep in humans and that it pertains to different aspects of the consolidation process.
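As a purely illustrative sketch of the multivariate pattern classification idea described in the abstract, the snippet below decodes a category label from simulated trial-wise EEG feature vectors with an off-the-shelf linear classifier. The feature counts, class structure, and classifier choice are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulate sleep-EEG feature vectors for two learning conditions
# (hypothetical: 64 channels x 5 frequency bands per trial).
n_trials, n_features = 200, 64 * 5
labels = rng.integers(0, 2, n_trials)          # 0 = category A, 1 = category B
X = rng.normal(size=(n_trials, n_features))

# Inject a weak class-dependent pattern so decoding is above chance.
pattern = rng.normal(size=n_features) * 0.3
X[labels == 1] += pattern

# Cross-validated decoding accuracy; chance level here is 0.5.
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, labels, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```

In a real analysis, above-chance accuracy would be established against a permutation-based null distribution rather than the nominal 0.5 chance level.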

Summary

          Neuronal learning activity is reactivated during sleep but the dynamics of this reactivation in humans are still poorly understood. Here the authors show that memory processing occurs during all stages of sleep in humans but that reprocessing of memory content in REM and non-REM sleep has different effects on later memory performance.


Most cited references (75)


          EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis

We have developed a toolbox and graphic user interface, EEGLAB, running under the cross-platform MATLAB environment (The Mathworks, Inc.) for processing collections of single-trial and/or averaged EEG data of any number of channels. Available functions include EEG data, channel and event information importing, data visualization (scrolling, scalp map and dipole model plotting, plus multi-trial ERP-image plots), preprocessing (including artifact rejection, filtering, epoch selection, and averaging), independent component analysis (ICA) and time/frequency decompositions including channel and component cross-coherence supported by bootstrap statistical methods based on data resampling. EEGLAB functions are organized into three layers. Top-layer functions allow users to interact with the data through the graphic interface without needing to use MATLAB syntax. Menu options allow users to tune the behavior of EEGLAB to available memory. Middle-layer functions allow users to customize data processing using command history and interactive 'pop' functions. Experienced MATLAB users can use EEGLAB data structures and stand-alone signal processing functions to write custom and/or batch analysis scripts. Extensive function help and tutorial information are included. A 'plug-in' facility allows easy incorporation of new EEG modules into the main menu. EEGLAB is freely available (http://www.sccn.ucsd.edu/eeglab/) under the GNU public license for noncommercial use and open source development, together with sample data, user tutorial and extensive documentation.
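EEGLAB itself is a MATLAB toolbox; as a rough, language-swapped illustration of the ICA step it provides, the following Python sketch uses scikit-learn's FastICA (an assumption for illustration, not EEGLAB's own ICA routines) to unmix two synthetic sources from two simulated channels, mimicking blink-artifact separation.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)

# Two latent sources: a 10 Hz "alpha" rhythm and a slow, blink-like square wave.
s1 = np.sin(2 * np.pi * 10 * t)
s2 = np.sign(np.sin(2 * np.pi * 0.5 * t))
S = np.c_[s1, s2] + 0.05 * rng.normal(size=(t.size, 2))

# Mix the sources into two "channels", as scalp electrodes would see them.
A = np.array([[1.0, 0.5], [0.7, 1.2]])
X = S @ A.T

# Unmix; each column of `recovered` estimates one source (up to sign/scale).
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)

# An artifact component could now be zeroed out and the cleaned channels
# rebuilt with ica.inverse_transform, the usual artifact-rejection workflow.
```

ICA recovers sources only up to sign, scale, and ordering, which is why artifact components are typically identified by inspection of their topography and time course rather than by index.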

            The assessment and analysis of handedness: The Edinburgh inventory


              The θ-γ neural code.

Theta and gamma frequency oscillations occur in the same brain regions and interact with each other, a process called cross-frequency coupling. Here, we review evidence for the following hypothesis: that the dual oscillations form a code for representing multiple items in an ordered way. This form of coding has been most clearly demonstrated in the hippocampus, where different spatial information is represented in different gamma subcycles of a theta cycle. Other experiments have tested the functional importance of oscillations and their coupling. These involve correlation of oscillatory properties with memory states, correlation with memory performance, and effects of disrupting oscillations on memory. Recent work suggests that this coding scheme coordinates communication between brain regions and is involved in sensory as well as memory processes.
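The cross-frequency coupling this review describes can be illustrated with a minimal, hypothetical sketch: a synthetic signal whose 40 Hz gamma amplitude is tied to the phase of a 6 Hz theta rhythm, quantified with a Canolty-style mean-vector-length measure. All frequencies, band edges, and parameters here are invented for illustration.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 500.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(2)

# Synthetic LFP: 40 Hz gamma whose amplitude waxes at the peak of 6 Hz theta.
theta_phase = 2 * np.pi * 6 * t
gamma = (1 + np.cos(theta_phase)) * np.sin(2 * np.pi * 40 * t)
sig = np.cos(theta_phase) + 0.5 * gamma + 0.1 * rng.normal(size=t.size)

def bandpass(x, lo, hi):
    """Zero-phase 4th-order Butterworth band-pass filter."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

phase = np.angle(hilbert(bandpass(sig, 4, 8)))    # theta phase
amp = np.abs(hilbert(bandpass(sig, 30, 50)))      # gamma amplitude envelope

# Mean vector length: near 0 when amplitude is unrelated to phase,
# larger when gamma amplitude clusters at a preferred theta phase.
mvl = np.abs(np.mean(amp * np.exp(1j * phase)))
print(f"phase-amplitude coupling strength: {mvl:.3f}")
```

In practice the raw mean vector length depends on amplitude scale, so real analyses compare it against surrogate data (e.g. phase-shuffled signals) before interpreting it.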

                Author and article information

Journal
Nature Communications (Nat Commun)
Nature Publishing Group
ISSN: 2041-1723
Published: 17 May 2017
Volume: 8
Article number: 15404
                Affiliations
                [1 ]Medical Psychology and Behavioral Neurobiology, Eberhard Karls Universität Tübingen , Silcherstr. 5, Tübingen 72076, Germany
                [2 ]Bernstein Center for Computational Neuroscience, LMU München , Großhadernerstr. 2, Planegg-Martinsried 82152, Germany
                [3 ]Department of Psychology, LMU München , Leopoldstr. 13, München 80802, Germany
Author notes
[*] These authors contributed equally to this work.

Article
Article ID: ncomms15404
DOI: 10.1038/ncomms15404
PMC: PMC5442370
PMID: 28513589
Copyright © 2017, The Author(s)

                This work is licensed under a Creative Commons Attribution 4.0 International License. The images or other third party material in this article are included in the article's Creative Commons license, unless indicated otherwise in the credit line; if the material is not included under the Creative Commons license, users will need to obtain permission from the license holder to reproduce the material. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/

History
Received: 28 October 2016
Accepted: 27 March 2017
                Categories
                Article

