      Is Open Access

      Colloquium: Criticality and dynamical scaling in living systems

      Preprint


          Abstract

          A celebrated and controversial hypothesis conjectures that some biological systems (parts, aspects, or groups of them) may extract important functional benefits from operating at the edge of instability, halfway between order and disorder, i.e., in the vicinity of the critical point of a phase transition. Criticality has been argued to provide biological systems with an optimal balance between robustness against perturbations and flexibility to adapt to changing conditions, as well as to confer on them optimal computational capabilities, huge dynamical repertoires, unparalleled sensitivity to stimuli, etc. Criticality, with its concomitant scale invariance, can be conjectured to emerge in living systems as the result of adaptive and evolutionary processes that, for reasons to be fully elucidated, select for it as a template upon which higher layers of complexity can rest. This hypothesis is very suggestive, as it proposes that criticality could constitute a general and common organizing strategy in biology stemming from the physics of phase transitions. However, despite its thrilling implications, the hypothesis is still in its embryonic state as a well-founded theory and, as such, has elicited some healthy skepticism. From the experimental side, the advent of high-throughput technologies has created new prospects in the exploration of biological systems, and empirical evidence in favor of criticality has proliferated, with examples ranging from endogenous brain activity and gene-expression patterns, to flocks of birds and insect-colony foraging, to name but a few...


          Most cited references (134)


          The structure and function of complex networks

          M. Newman (2003)
          Inspired by empirical studies of networked systems such as the Internet, social networks, and biological networks, researchers have in recent years developed a variety of techniques and models to help us understand or predict the behavior of these systems. Here we review developments in this field, including such concepts as the small-world effect, degree distributions, clustering, network correlations, random graph models, models of network growth and preferential attachment, and dynamical processes taking place on networks.
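The growth and preferential-attachment mechanism surveyed in this review can be sketched in a few lines. The toy below grows a graph in which each new node links to existing nodes with probability proportional to their degree, producing the heavy-tailed degree distribution the review discusses (function and parameter names here are illustrative, not code from the paper):

```python
import random

def preferential_attachment(n, m, seed=0):
    """Grow a graph to n nodes; each new node attaches to m existing
    nodes chosen with probability proportional to current degree."""
    rng = random.Random(seed)
    # Seed the process with a small clique of m nodes.
    edges = [(i, j) for i in range(m) for j in range(i + 1, m)]
    # Each node appears in this list once per edge endpoint, so a
    # uniform draw from it is a degree-proportional draw over nodes.
    targets = [v for e in edges for v in e]
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))
        for t in chosen:
            edges.append((new, t))
            targets.extend((new, t))
    return edges

edges = preferential_attachment(1000, 2)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
```

With n = 1000 and m = 2 the early nodes accumulate degrees far above the mean of about four: the hub-dominated, broad degree distribution characteristic of preferential attachment.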

            Power-law distributions in empirical data

            Power-law distributions occur in many situations of scientific interest and have significant consequences for our understanding of natural and man-made phenomena. Unfortunately, the detection and characterization of power laws is complicated by the large fluctuations that occur in the tail of the distribution (the part of the distribution representing large but rare events) and by the difficulty of identifying the range over which power-law behavior holds. Commonly used methods for analyzing power-law data, such as least-squares fitting, can produce substantially inaccurate estimates of parameters for power-law distributions, and even in cases where such methods return accurate answers they are still unsatisfactory because they give no indication of whether the data obey a power law at all. Here we present a principled statistical framework for discerning and quantifying power-law behavior in empirical data. Our approach combines maximum-likelihood fitting methods with goodness-of-fit tests based on the Kolmogorov-Smirnov statistic and likelihood ratios. We evaluate the effectiveness of the approach with tests on synthetic data and give critical comparisons to previous approaches. We also apply the proposed methods to twenty-four real-world data sets from a range of different disciplines, each of which has been conjectured to follow a power-law distribution. In some cases we find these conjectures to be consistent with the data while in others the power law is ruled out.
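For continuous data with a known lower cutoff x_min, the maximum-likelihood fit advocated in this abstract reduces to a closed-form estimator for the exponent. The sketch below checks that estimator on synthetic samples drawn by inverse-transform sampling; it is a toy illustration of one step, not the paper's full pipeline (which also selects x_min and runs Kolmogorov-Smirnov tests), and the variable names are ours:

```python
import math
import random

def fit_powerlaw_alpha(xs, xmin):
    """Continuous-case maximum-likelihood estimate of the power-law
    exponent: alpha = 1 + n / sum(ln(x / xmin)) over x >= xmin."""
    tail = [x for x in xs if x >= xmin]
    return 1.0 + len(tail) / sum(math.log(x / xmin) for x in tail)

# Synthetic check: draw from p(x) ~ x^(-alpha) by inverse transform.
rng = random.Random(42)
alpha_true, xmin = 2.5, 1.0
xs = [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha_true - 1.0))
      for _ in range(50000)]
alpha_hat = fit_powerlaw_alpha(xs, xmin)
```

With 50,000 samples the estimate lands within a few hundredths of the true exponent 2.5, whereas a least-squares fit to a log-log histogram of the same data would typically be biased.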

              Real-time computing without stable states: a new framework for neural computation based on perturbations.

              A key challenge for neural modeling is to explain how a continuous stream of multimodal input from a rapidly changing environment can be processed by stereotypical recurrent circuits of integrate-and-fire neurons in real time. We propose a new computational model for real-time computing on time-varying input that provides an alternative to paradigms based on Turing machines or attractor neural networks. It does not require a task-dependent construction of neural circuits. Instead, it is based on principles of high-dimensional dynamical systems in combination with statistical learning theory and can be implemented on generic evolved or found recurrent circuitry. It is shown that the inherent transient dynamics of the high-dimensional dynamical system formed by a sufficiently large and heterogeneous neural circuit may serve as universal analog fading memory. Readout neurons can learn to extract in real time from the current state of such a recurrent neural circuit information about current and past inputs that may be needed for diverse tasks. Stable internal states are not required for giving a stable output, since transient internal states can be transformed by readout neurons into stable target outputs due to the high dimensionality of the dynamical system. Our approach is based on a rigorous computational model, the liquid state machine, that, unlike Turing machines, does not require sequential transitions between well-defined discrete internal states. It is supported, as the Turing machine is, by rigorous mathematical results that predict universal computational power under idealized conditions, but for the biologically more realistic scenario of real-time processing of time-varying inputs. Our approach provides new perspectives for the interpretation of neural coding, the design of experiments and data analysis in neurophysiology, and the solution of problems in robotics and neurotechnology.
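The core idea (a fixed random recurrent "liquid" whose transient states feed a trained linear readout) can be sketched as a discrete-time rate model. This is an echo-state-style simplification, not the authors' spiking liquid state machine, and every size and parameter below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, delay = 200, 2000, 3

# Fixed random recurrent "liquid", scaled just below the edge of stability.
W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius 0.9
w_in = rng.normal(size=N)

u = rng.uniform(-1, 1, size=T)                  # time-varying input stream
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])            # transient high-dim. state
    states[t] = x

# Only the linear readout is trained: least squares to report the
# input from `delay` steps ago, exploiting the liquid's fading memory.
y_target = u[:-delay]
X = states[delay:]
w_out, *_ = np.linalg.lstsq(X, y_target, rcond=None)
mse = float(np.mean((X @ w_out - y_target) ** 2))
```

The recurrent weights are never trained; only the readout is, which is the point of the framework: the transient dynamics already hold the information, and different readouts can extract different functions of past input from the same liquid.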

                Author and article information

                Journal: arXiv (preprint)
                Date: 12 December 2017
                Article ID: arXiv:1712.04499
                Record ID: 7b3906fe-a890-4ba4-8548-e57203147deb
                License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
                Custom metadata: 7 figures; paper submitted
                Subject categories: cond-mat.stat-mech, nlin.AO, physics.bio-ph, q-bio.QM
