
      Dynamic Adaptive Computation: Tuning network states to task requirements

      Preprint


          Abstract

          Neural circuits are able to perform computations under very diverse conditions and requirements. The required computations impose clear constraints on their fine-tuning: a rapid and maximally informative response to stimuli in general requires decorrelated baseline neural activity. Such network dynamics is known as asynchronous-irregular. In contrast, spatio-temporal integration of information requires maintenance and transfer of stimulus information over extended time periods. This can be realized at criticality, a phase transition where correlations, sensitivity and integration time diverge. Being able to flexibly switch, or even combine the above properties in a task-dependent manner would present a clear functional advantage. We propose that cortex operates in a "reverberating regime" because it is particularly favorable for ready adaptation of computational properties to context and task. This reverberating regime enables cortical networks to interpolate between the asynchronous-irregular and the critical state by small changes in effective synaptic strength or excitation-inhibition ratio. These changes directly adapt computational properties, including sensitivity, amplification, integration time and correlation length within the local network. We review recent converging evidence that cortex in vivo operates in the reverberating regime, and that various cortical areas have adapted their integration times to processing requirements. In addition, we propose that neuromodulation enables a fine-tuning of the network, so that local circuits can either decorrelate or integrate, and quench or maintain their input depending on task. We argue that this task-dependent tuning, which we call "dynamic adaptive computation", presents a central organization principle of cortical networks and discuss first experimental evidence.
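The interpolation the abstract describes can be illustrated with a minimal branching-network sketch (not code from the paper; the external drive `h`, the time-bin convention, and the regression estimator are illustrative assumptions). Activity in each bin triggers on average `m` spikes in the next bin; the autocorrelation, and hence the network's integration time, diverges as the branching ratio `m` approaches the critical value 1:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_branching(m, h, steps=20000):
    """Branching network: each spike triggers on average m spikes in the
    next time bin; h is an external Poisson drive keeping activity alive."""
    a = np.empty(steps)
    a[0] = h / (1.0 - m)              # start near the stationary mean
    for t in range(1, steps):
        a[t] = rng.poisson(m * a[t - 1] + h)
    return a

def estimated_m(a):
    """Slope of the regression a[t+1] ~ a[t] recovers the branching ratio."""
    x, y = a[:-1], a[1:]
    return np.cov(x, y)[0, 1] / np.var(x)

for m in (0.9, 0.98):
    a = simulate_branching(m, h=2.0)
    m_hat = estimated_m(a)
    tau = -1.0 / np.log(m_hat)        # integration time, in units of the bin
    print(f"m = {m}: estimated m = {m_hat:.3f}, integration time ~ {tau:.1f} bins")
```

Moving `m` from 0.9 to 0.98 — a small change in effective synaptic strength — multiplies the integration time several-fold, which is the sense in which the reverberating regime allows cheap task-dependent tuning.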


          Most cited references (40)


          Real-time computing without stable states: a new framework for neural computation based on perturbations.

          A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
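The reservoir idea can be sketched with a discrete-time echo state network, a rate-based cousin of the liquid state machine (a toy under stated assumptions, not the spiking model of the paper; the network size, spectral radius, and delay-recall task are all illustrative choices): a fixed random recurrent network serves as fading memory, and only a linear readout is trained, here to recover the input from several steps in the past.

```python
import numpy as np

rng = np.random.default_rng(1)
n, T, delay = 200, 3000, 5

# Fixed random recurrent weights, rescaled so the spectral radius is < 1:
# the reservoir forgets its initial state (fading memory).
W = rng.normal(0, 1, (n, n))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = 0.5 * rng.normal(0, 1, n)

u = rng.uniform(-1, 1, T)               # random input stream
x = np.zeros(n)
states = np.empty((T, n))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])    # reservoir update
    states[t] = x

# Linear readout trained by least squares to report the input `delay` steps back.
X, y = states[delay:], u[:-delay]
w_out = np.linalg.lstsq(X, y, rcond=None)[0]
pred = X @ w_out
print("delay-recall correlation:", np.corrcoef(pred, y)[0, 1])
```

No task-dependent circuit construction is needed: the same fixed reservoir supports many readouts, each extracting a different function of current and past inputs from its transient state.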
            Probabilistic decision making by slow reverberation in cortical circuits.

            Recent physiological studies of alert primates have revealed cortical neural correlates of key steps in a perceptual decision-making process. To elucidate synaptic mechanisms of decision making, I investigated a biophysically realistic cortical network model for a visual discrimination experiment. In the model, slow recurrent excitation and feedback inhibition produce attractor dynamics that amplify the difference between conflicting inputs and generates a binary choice. The model is shown to account for salient characteristics of the observed decision-correlated neural activity, as well as the animal's psychometric function and reaction times. These results suggest that recurrent excitation mediated by NMDA receptors provides a candidate cellular mechanism for the slow time integration of sensory stimuli and the formation of categorical choices in a decision-making neocortical network.
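The competition mechanism can be caricatured by a two-population rate model with self-excitation and cross-inhibition (a hypothetical sketch, not the biophysically realistic NMDA-based model of the paper; all weights, time constants, and noise levels are illustrative): recurrent dynamics amplify a small difference between the two inputs until one population wins and the other is suppressed, yielding a binary choice.

```python
import numpy as np

def decide(i1, i2, steps=4000, dt=0.1, tau=10.0, seed=0):
    """Two populations with self-excitation and mutual inhibition.
    Attractor dynamics amplify the input difference into a binary choice;
    noise makes the decision probabilistic near equal evidence."""
    rng = np.random.default_rng(seed)
    r = np.zeros(2)                       # population rates
    w_self, w_inh = 1.6, 1.2              # recurrent excitation, cross-inhibition
    for _ in range(steps):
        inp = np.array([i1, i2]) + w_self * r - w_inh * r[::-1]
        drive = np.tanh(np.clip(inp, 0, None))   # rectified, saturating f-I curve
        r += dt / tau * (-r + drive) + np.sqrt(dt) * 0.01 * rng.normal(size=2)
    return int(r[1] > r[0])               # 0 -> population 1 wins, 1 -> population 2

# Slightly stronger evidence for option 1 should usually win:
choices = [decide(0.55, 0.45, seed=s) for s in range(20)]
print("fraction choosing option 2:", sum(choices) / 20)
```

The slow integration (time constant `tau`) plays the role the paper attributes to NMDA-mediated recurrent excitation: evidence is accumulated over time before the attractor dynamics commit to one of the two stable choice states.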
              Internal brain state regulates membrane potential synchrony in barrel cortex of behaving mice.

              Internal brain states form key determinants for sensory perception, sensorimotor coordination and learning. A prominent reflection of different brain states in the mammalian central nervous system is the presence of distinct patterns of cortical synchrony, as revealed by extracellular recordings of the electroencephalogram, local field potential and action potentials. Such temporal correlations of cortical activity are thought to be fundamental mechanisms of neuronal computation. However, it is unknown how cortical synchrony is reflected in the intracellular membrane potential (V(m)) dynamics of behaving animals. Here we show, using dual whole-cell recordings from layer 2/3 primary somatosensory barrel cortex in behaving mice, that the V(m) of nearby neurons is highly correlated during quiet wakefulness. However, when the mouse is whisking, an internally generated state change reduces the V(m) correlation, resulting in a desynchronized local field potential and electroencephalogram. Action potential activity was sparse during both quiet wakefulness and active whisking. Single action potentials were driven by a large, brief and specific excitatory input that was not present in the V(m) of neighbouring cells. Action potential initiation occurs with a higher signal-to-noise ratio during active whisking than during quiet periods. Therefore, we show that an internal brain state dynamically regulates cortical membrane potential synchrony during behaviour and defines different modes of cortical processing.

                Author and article information

                Published: 20 September 2018
                arXiv: 1809.07550 (q-bio.NC, nlin.AO)
                License: CC BY-SA 4.0 (http://creativecommons.org/licenses/by-sa/4.0/)
                Comments: 6 pages + references, 2 figures
                Subjects: Neurosciences, Nonlinear & Complex Systems
