
      Supervised machine learning to estimate instabilities in chaotic systems: estimation of local Lyapunov exponents

      Preprint


          Abstract

          In chaotic dynamical systems such as the weather, prediction errors grow faster in some situations than in others. Real-time knowledge about the error growth could enable strategies to adjust the modelling and forecasting infrastructure on-the-fly to increase accuracy and/or reduce computation time. For example, one could change the spatio-temporal resolution of the numerical model, locally increase the data availability, etc. Local Lyapunov exponents are known indicators of the rate at which very small prediction errors grow over a finite time interval. However, their computation is very expensive: it requires maintaining and evolving a tangent linear model and orthogonalisation algorithms. In this study, we investigate the capability of supervised machine learning to estimate the imminent local Lyapunov exponents, taking the current and recent time steps of the system trajectory as input. Thus machine learning is not used here to emulate a physical model or some of its components, but "non-intrusively" as a complementary tool. Specifically, we test the accuracy of four popular supervised learning algorithms: regression trees, multilayer perceptrons, convolutional neural networks and long short-term memory networks. Experiments are conducted on two low-dimensional chaotic systems of ordinary differential equations, the Lorenz 63 and the Rössler models. We find that on average the machine learning algorithms predict the stable local Lyapunov exponent accurately, the unstable exponent reasonably accurately, and the neutral exponent only somewhat accurately. We show that greater prediction accuracy is associated with local homogeneity of the local Lyapunov exponents on the system attractor. Importantly, the situations in which (forecast) errors grow fastest are not necessarily the same as those where it is more difficult to predict local Lyapunov exponents with machine learning.
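          The expensive reference computation that the abstract describes — evolving a tangent linear model along the trajectory and re-orthogonalising it — can be sketched for the Lorenz 63 system. The following is a minimal illustrative sketch, not the authors' code; the parameter values (the classical σ = 10, ρ = 28, β = 8/3), the RK4 integrator, and the choice of interval length are assumptions for the example:

          ```python
          import numpy as np

          SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0  # classical Lorenz 63 parameters

          def lorenz63(s):
              x, y, z = s
              return np.array([SIGMA * (y - x), x * (RHO - z) - y, x * y - BETA * z])

          def jacobian(s):
              # Jacobian of the Lorenz 63 right-hand side, used by the tangent linear model
              x, y, z = s
              return np.array([[-SIGMA, SIGMA, 0.0],
                               [RHO - z, -1.0, -x],
                               [y, x, -BETA]])

          def step(s, Q, dt):
              # One RK4 step of the coupled system: state s and tangent frame Q
              def f(s_, Q_):
                  return lorenz63(s_), jacobian(s_) @ Q_
              k1s, k1q = f(s, Q)
              k2s, k2q = f(s + 0.5 * dt * k1s, Q + 0.5 * dt * k1q)
              k3s, k3q = f(s + 0.5 * dt * k2s, Q + 0.5 * dt * k2q)
              k4s, k4q = f(s + dt * k3s, Q + dt * k3q)
              s_new = s + dt / 6 * (k1s + 2 * k2s + 2 * k3s + k4s)
              Q_new = Q + dt / 6 * (k1q + 2 * k2q + 2 * k3q + k4q)
              return s_new, Q_new

          def local_lyapunov(s0, T, dt=0.01):
              # Evolve an orthonormal tangent frame and re-orthogonalise with QR at
              # each step; the accumulated log of |diag(R)|, divided by the interval
              # length, gives the local (finite-time) Lyapunov exponents over [0, T].
              s, Q = s0, np.eye(3)
              n = int(T / dt)
              acc = np.zeros(3)
              for _ in range(n):
                  s, Q = step(s, Q, dt)
                  Q, R = np.linalg.qr(Q)
                  acc += np.log(np.abs(np.diag(R)))
              return acc / (n * dt)

          # Spin up onto the attractor, then estimate exponents over a long window,
          # where they approach the global values (unstable, neutral, stable).
          s = np.array([1.0, 1.0, 1.0])
          for _ in range(5000):
              s, _ = step(s, np.eye(3), 0.01)
          lam = local_lyapunov(s, T=200.0)
          print(lam)  # roughly (0.9, 0.0, -14.6) for these parameters
          ```

          Over a short interval T the same routine yields the local exponents the paper's machine learning models are trained to predict; the loop over integration steps and QR factorisations is precisely the cost that a learned estimator would avoid.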


          Author and article information

          Journal
          10 February 2022
          Article
          2202.04944

          http://creativecommons.org/licenses/by/4.0/

          History
          Custom metadata
          33 pages, 9 figures
          physics.comp-ph nlin.CD

          Mathematical & Computational physics, Nonlinear & Complex systems
