      Is Open Access

      News without the buzz: reading out weak theta rhythms in the hippocampus

      Preprint
      research-article


          Abstract

          Local field potentials (LFPs) reflect the collective dynamics of neural populations, yet their exact relationship to neural codes remains unknown [1]. One notable exception is the theta rhythm of the rodent hippocampus, which seems to provide a reference clock to decode the animal’s position from spatiotemporal patterns of neuronal spiking [2] or LFPs [3]. But when the animal stops, theta becomes irregular [4], potentially indicating the breakdown of temporal coding by neural populations. Here we show that no such breakdown occurs, introducing an artificial neural network that can recover position-tuned rhythmic patterns (pThetas) without relying on the more prominent theta rhythm as a reference clock. pTheta and theta preferentially correlate with place cell and interneuron spiking, respectively. When rats forage in an open field, pTheta is jointly tuned to position and head orientation, a property not seen in individual place cells but expected to emerge from place cell sequences [5]. Our work demonstrates that weak and intermittent oscillations, as seen in many brain regions and species, can carry behavioral information commensurate with population spike codes.

          Related collections

          Most cited references (40)


          Theta oscillations in the hippocampus.

          Theta oscillations represent the "on-line" state of the hippocampus. The extracellular currents underlying theta waves are generated mainly by the entorhinal input, CA3 (Schaffer) collaterals, and voltage-dependent Ca(2+) currents in pyramidal cell dendrites. The rhythm is believed to be critical for temporal coding/decoding of active neuronal ensembles and the modification of synaptic weights. Nevertheless, numerous critical issues regarding both the generation of theta oscillations and their functional significance remain challenges for future research.

            Removing electroencephalographic artifacts by blind source separation.

            Eye movements, eye blinks, cardiac signals, muscle noise, and line noise present serious problems for electroencephalographic (EEG) interpretation and analysis when rejecting contaminated EEG segments results in an unacceptable data loss. Many methods have been proposed to remove artifacts from EEG recordings, especially those arising from eye movements and blinks. Often regression in the time or frequency domain is performed on parallel EEG and electrooculographic (EOG) recordings to derive parameters characterizing the appearance and spread of EOG artifacts in the EEG channels. Because EEG and ocular activity mix bidirectionally, regressing out eye artifacts inevitably involves subtracting relevant EEG signals from each record as well. Regression methods become even more problematic when a good regressing channel is not available for each artifact source, as in the case of muscle artifacts. Use of principal component analysis (PCA) has been proposed to remove eye artifacts from multichannel EEG. However, PCA cannot completely separate eye artifacts from brain signals, especially when they have comparable amplitudes. Here, we propose a new and generally applicable method for removing a wide variety of artifacts from EEG records based on blind source separation by independent component analysis (ICA). Our results on EEG data collected from normal and autistic subjects show that ICA can effectively detect, separate, and remove contamination from a wide variety of artifactual sources in EEG records with results comparing favorably with those obtained using regression and PCA methods. ICA can also be used to analyze blink-related brain activity.
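The ICA-based artifact removal described above can be sketched with scikit-learn's FastICA on synthetic signals. The sources (an oscillation, a blink-like drift, 50 Hz line noise), the random mixing matrix, and the correlation-based selection of the artifact component are all illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)  # 8 s at 250 Hz (hypothetical sampling rate)

# Synthetic sources: a neural-like 10 Hz oscillation, a slow square-wave
# "blink" artifact, and 50 Hz line noise.
s_neural = np.sin(2 * np.pi * 10 * t)
s_blink = np.sign(np.sin(2 * np.pi * 0.5 * t))
s_line = 0.5 * np.sin(2 * np.pi * 50 * t)
S = np.c_[s_neural, s_blink, s_line]          # (2000, 3) sources

# Mix the sources into three "EEG channels" with a random mixing matrix.
A = rng.normal(size=(3, 3))
X = S @ A.T                                   # (2000, 3) observed channels

# Unmix with ICA, zero out the component most correlated with the blink
# template, then reproject to channel space to obtain "cleaned" EEG.
ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(X)             # estimated independent sources
blink_idx = int(np.argmax(
    [abs(np.corrcoef(c, s_blink)[0, 1]) for c in components.T]))
components[:, blink_idx] = 0.0
X_clean = ica.inverse_transform(components)   # channels with blink removed
```

In practice the artifact component would be identified by its scalp topography or by correlation with a simultaneously recorded EOG channel rather than against a known template.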

              Deep learning with convolutional neural networks for EEG decoding and visualization

              Deep learning with convolutional neural networks (deep ConvNets) has revolutionized computer vision through end‐to‐end learning, that is, learning from the raw data. There is increasing interest in using deep ConvNets for end‐to‐end EEG analysis, but a better understanding of how to design and train ConvNets for end‐to‐end EEG decoding and how to visualize the informative EEG features the ConvNets learn is still needed. Here, we studied deep ConvNets with a range of different architectures, designed for decoding imagined or executed tasks from raw EEG. Our results show that recent advances from the machine learning field, including batch normalization and exponential linear units, together with a cropped training strategy, boosted the deep ConvNets decoding performance, reaching at least as good performance as the widely used filter bank common spatial patterns (FBCSP) algorithm (mean decoding accuracies 82.1% FBCSP, 84.0% deep ConvNets). While FBCSP is designed to use spectral power modulations, the features used by ConvNets are not fixed a priori. Our novel methods for visualizing the learned features demonstrated that ConvNets indeed learned to use spectral power modulations in the alpha, beta, and high gamma frequencies, and proved useful for spatially mapping the learned features by revealing the topography of the causal contributions of features in different frequency bands to the decoding decision. Our study thus shows how to design and train ConvNets to decode task‐related information from the raw EEG without handcrafted features and highlights the potential of deep ConvNets combined with advanced visualization techniques for EEG‐based brain mapping. Hum Brain Mapp 38:5391–5420, 2017. © 2017 Wiley Periodicals, Inc.
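The "cropped training strategy" the abstract credits for part of the decoding gain can be illustrated with a minimal numpy sketch: each labeled EEG trial is cut into many overlapping windows that all inherit the trial's label, multiplying the number of training examples. The channel count, crop length, and stride below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def make_crops(trial, crop_len, stride):
    """Cut one EEG trial (channels x time) into overlapping crops.

    Each crop inherits the trial's label, so a single trial yields
    many training examples -- the core idea of cropped training.
    """
    n_ch, n_t = trial.shape
    starts = range(0, n_t - crop_len + 1, stride)
    return np.stack([trial[:, s:s + crop_len] for s in starts])

# Hypothetical trial: 22 channels, 1000 time samples.
trial = np.random.default_rng(0).normal(size=(22, 1000))
crops = make_crops(trial, crop_len=500, stride=100)  # (6, 22, 500)
```

At test time, the per-crop predictions for a trial are typically averaged to produce a single trial-level decision.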

                Author and article information

                Journal
                bioRxiv (Cold Spring Harbor Laboratory)
                23 December 2023
                2023.12.22.573160
                Affiliations
                [1] Department of Natural Sciences, Pitzer and Scripps Colleges, Claremont, CA
                [2] Howard Hughes Medical Institute, Janelia Research Campus, Ashburn, VA
                [3] University of Chicago, Chicago, IL
                [4] Pitzer College, Claremont, CA
                [5] William Alanson White Institute, New York, NY
                [6] Howard Hughes Medical Institute, Beth Israel Deaconess Medical Center, Boston, MA
                [7] Helen Wills Neuroscience Institute, UC Berkeley, Berkeley, CA
                [8] Neuromorphic Computing Lab, Intel Corporation, Santa Clara, CA
                Author notes

                Author contributions: The project was originally conceptualized by G.A. and F.S. The behavioral paradigm and data collection were developed by B.L., E.P., and A.L. Data were collected by B.L. TIMBRE, the data analysis, and the data visualizations were developed by G.A. The TIMBRE optimization and schematic were developed by S.A. The software repository was created by G.A. and S.A. The manuscript was written by G.A. and F.S. and was edited and reviewed by all authors.

                Author information
                http://orcid.org/0000-0001-7300-7586
                http://orcid.org/0000-0002-3105-6766
                http://orcid.org/0000-0001-5518-9590
                http://orcid.org/0000-0002-6738-9263
                Article
                DOI: 10.1101/2023.12.22.573160
                PMC: 10769352
                PMID: 38187593

                This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, which allows reusers to distribute, remix, adapt, and build upon the material in any medium or format for noncommercial purposes only, and only so long as attribution is given to the creator.

