
Information Decomposition in Bivariate Systems: Theory and Application to Cardiorespiratory Dynamics

Entropy (MDPI AG)


Most cited references (41)


          Testing for causality


            Estimating Mutual Information

We present two classes of improved estimators for mutual information \(M(X,Y)\), from samples of random points distributed according to some joint probability density \(\mu(x,y)\). In contrast to conventional estimators based on binning, they are based on entropy estimates from \(k\)-nearest-neighbour distances. This means that they are data efficient (with \(k=1\) we resolve structures down to the smallest possible scales), adaptive (the resolution is higher where data are more numerous), and have minimal bias. Indeed, the bias of the underlying entropy estimates is mainly due to non-uniformity of the density at the smallest resolved scale, giving typically systematic errors which scale as functions of \(k/N\) for \(N\) points. Numerically, we find that both families become exact for independent distributions, i.e. the estimator \(\hat M(X,Y)\) vanishes (up to statistical fluctuations) if \(\mu(x,y) = \mu(x)\mu(y)\). This holds for all tested marginal distributions and for all dimensions of \(x\) and \(y\). In addition, we give estimators for redundancies between more than two random variables. We compare our algorithms in detail with existing algorithms. Finally, we demonstrate the usefulness of our estimators for assessing the actual independence of components obtained from independent component analysis (ICA), for improving ICA, and for estimating the reliability of blind source separation.
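The estimator this abstract describes can be made concrete in a few lines. The following is a minimal sketch, assuming the commonly cited form of the first estimator family, \(\hat M(X,Y) = \psi(k) + \psi(N) - \langle \psi(n_x+1) + \psi(n_y+1) \rangle\), with max-norm distances; the function name mi_knn and all variable names are ours, not from the paper's code.

```python
# Minimal sketch of a k-nearest-neighbour mutual information estimator
# (first family, as summarized in the lead-in above). Illustrative only,
# not the authors' reference implementation.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def mi_knn(x, y, k=3):
    """Estimate M(X, Y) in nats; x has shape (N, dx), y has shape (N, dy)."""
    n = len(x)
    xy = np.hstack([x, y])
    # eps[i]: max-norm distance from point i to its k-th neighbour in the
    # joint space (k+1 because the query returns the point itself first).
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    # n_x[i], n_y[i]: points strictly within eps[i] in each marginal space,
    # excluding the point itself (hence the -1); the tiny shrinkage of the
    # radius enforces the strict inequality.
    nx = cKDTree(x).query_ball_point(x, eps - 1e-12, p=np.inf,
                                     return_length=True) - 1
    ny = cKDTree(y).query_ball_point(y, eps - 1e-12, p=np.inf,
                                     return_length=True) - 1
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Sanity check against the exact value for correlated Gaussians,
# M = -0.5 * log(1 - rho^2):
rng = np.random.default_rng(0)
rho = 0.6
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=2000)
print(mi_knn(z[:, :1], z[:, 1:]))   # estimate, roughly 0.22
print(-0.5 * np.log(1 - rho**2))    # exact value, 0.2231...
```

The Gaussian check also exercises the property claimed in the abstract: as rho goes to 0 the estimate fluctuates around zero rather than acquiring a systematic positive bias.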
              Effective connectivity: Influence, causality and biophysical modeling

This is the final paper in a Comments and Controversies series dedicated to “The identification of interacting networks in the brain using fMRI: Model selection, causality and deconvolution”. We argue that discovering effective connectivity depends critically on state-space models with biophysically informed observation and state equations. These models have to be endowed with priors on unknown parameters and afford checks for model identifiability. We consider the similarities and differences among Dynamic Causal Modeling, Granger Causal Modeling and other approaches. We establish links between past and current statistical causal modeling, in terms of Bayesian dependency graphs and Wiener–Akaike–Granger–Schweder influence measures. We show that some of the challenges faced in this field have promising solutions and speculate on future developments.
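The Granger-causal strand that this abstract contrasts with Dynamic Causal Modeling is easy to illustrate in isolation. Below is a minimal sketch using statsmodels' grangercausalitytests on a simulated pair of series in which x drives y; the simulation and all variable names are ours, for illustration only.

```python
# Illustrative bivariate Granger-causality test: x drives y with a lag,
# so "x Granger-causes y" should be detected.
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)
n = 500
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.standard_normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.standard_normal()

# grangercausalitytests checks whether the SECOND column helps predict
# the first; small p-values reject "no Granger causality".
res = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
print(res[1][0]["ssr_ftest"][1])  # p-value at lag 1: should be near 0
```

A test of this kind measures predictive influence only; the paper's argument is that effective connectivity additionally requires biophysically informed state-space models.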

                Author and article information

Journal: Entropy (CODEN: ENTRFG)
Publisher: MDPI AG
ISSN: 1099-4300
Published: January 12, 2015 (January 2015 issue)
Volume: 17
Issue: 1
Pages: 277-303
DOI: 10.3390/e17010277
Article ID: f2e71e71-0771-433b-8d64-4f14bd1b7639
Copyright: © 2015
License: https://creativecommons.org/licenses/by/4.0/
