Applications include population dynamics,[1][2][3][4][5][6] ecosystem service,[7] medicine,[8] neuroscience,[9][10][11] dynamical systems,[12][13][14] geophysics,[15][16][17] and human-computer interaction.[18]
EDM was originally developed by Robert May and George Sugihara.
It can be considered a methodology for data modeling, predictive analytics, dynamical system analysis, machine learning and time series analysis.
Mathematical models have tremendous power to describe observations of real-world systems.
They are routinely used to test hypotheses, explain mechanisms and predict future outcomes.
However, real-world systems are often nonlinear and multidimensional, in some instances rendering explicit equation-based modeling problematic.
Empirical models, which infer patterns and associations from the data instead of using hypothesized equations, represent a natural and flexible framework for modeling complex dynamics.
Donald DeAngelis and Simeon Yurek illustrated that canonical statistical models are ill-posed when applied to nonlinear dynamical systems.
EDM operates in this space, working on the multidimensional state-space of system dynamics rather than on one-dimensional observational time series.
EDM is thus a state-space, nearest-neighbors paradigm where system dynamics are inferred from states derived from observational time series.
This provides a model-free representation of the system naturally encompassing nonlinear dynamics.
A cornerstone of EDM is recognition that time series observed from a dynamical system can be transformed into higher-dimensional state-spaces by time-delay embedding with Takens's theorem.
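A minimal sketch of such a time-delay embedding is given below (the function name and the logistic-map example series are illustrative, not drawn from any specific EDM software):

```python
import numpy as np

def time_delay_embed(x, E, tau=1):
    """Embed a scalar time series into E-dimensional lagged coordinates
    (x_t, x_{t-tau}, ..., x_{t-(E-1)*tau}), one reconstructed state per row."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (E - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(E - 1, -1, -1)])

# Example: reconstruct a 3-dimensional state-space from a logistic-map series.
x = np.empty(500)
x[0] = 0.2
for t in range(499):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
states = time_delay_embed(x, E=3)
print(states.shape)  # (498, 3)
```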
As of 2022, the main algorithms are Simplex projection,[20] Sequential locally weighted global linear maps (S-Map) projection,[21] Multivariate embedding in Simplex or S-Map,[1] Convergent cross mapping (CCM),[22] and Multiview Embedding,[23] described below.
Simplex projection[20] locates the nearest neighbors to the location in the state-space from which a prediction is desired. Each neighbor is weighted proportional to its distance to the prediction origin vector in the state-space, and the prediction is computed as the average of the weighted phase-space simplex projected forward in time.
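A minimal sketch of this weighted nearest-neighbor forecast (names are illustrative; in practice the number of neighbors is typically E + 1, where E is the embedding dimension):

```python
import numpy as np

def simplex_forecast(library, targets, query, k):
    """Forecast at `query` as the weighted average of the forward images
    (`targets`) of its k nearest neighbors in the library of state vectors."""
    dists = np.linalg.norm(library - query, axis=1)
    idx = np.argsort(dists)[:k]                 # k nearest neighbors
    d = dists[idx]
    w = np.exp(-d / max(d[0], 1e-12))           # closer neighbors receive more weight
    return np.sum(w * targets[idx]) / np.sum(w)
```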
S-Map[21] extends the state-space prediction in Simplex from an average of the nearest neighbors to a linear regression fit to all state-space points, localised by a kernel that decays with distance from the prediction origin. Under this weighting, neighbors close to the prediction origin point have a higher weight than those further from it, such that a local linear approximation to the nonlinear system is reasonable.
This localisation allows one to identify an optimal local scale, in effect quantifying the degree of state dependence, and hence the nonlinearity of the system.
Another feature of S-Map is that, for a properly fit model, the regression coefficients between variables have been shown to approximate the gradient (directional derivative) of variables along the manifold.[27] These Jacobians represent the time-varying interaction strengths between system variables.
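A minimal sketch of the localised regression that S-Map performs (names are illustrative; the parameter theta controls the localisation, with theta = 0 reducing to a global linear model):

```python
import numpy as np

def smap_forecast(library, targets, query, theta):
    """Weighted linear regression over all library states, with weights that
    decay exponentially with distance from `query`. Returns the forecast and
    the fitted coefficients, which estimate the local Jacobian elements."""
    dists = np.linalg.norm(library - query, axis=1)
    w = np.exp(-theta * dists / dists.mean())
    A = np.hstack([np.ones((len(library), 1)), library]) * w[:, None]   # intercept + states
    coef, *_ = np.linalg.lstsq(A, targets * w, rcond=None)
    return coef[0] + query @ coef[1:], coef[1:]
```

Scanning theta and comparing out-of-sample forecast skill is how the optimal local scale, and hence the degree of nonlinearity, is typically assessed.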
Convergent cross mapping (CCM)[22] rests on the fact that when two variables belong to the same dynamical system, their reconstructions (via embeddings) map to the same attractor, so states reconstructed from an affected variable can be used to cross-predict the variable that drives it. CCM leverages this property to infer causality by predicting the putative causal variable from the affected variable's library of points (or vice versa for the other direction of causality), while assessing improvements in cross map predictability as larger and larger random samplings of the library are used. Cross map skill that increases and saturates with library size is taken as evidence of a causal influence.
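A minimal sketch of the convergence test behind CCM, assuming NumPy arrays `cause` and `effect` observed from the same system (all names and the simple exponential weighting are illustrative):

```python
import numpy as np

def cross_map_skill(cause, effect, E, lib_sizes, trials=20, seed=0):
    """Embed the affected variable and measure how well its neighbors
    predict the causal variable as the library size grows."""
    rng = np.random.default_rng(seed)
    # Time-delay embedding of the affected variable.
    states = np.column_stack([effect[(E - 1) - i : len(effect) - i] for i in range(E)])
    target = cause[E - 1:]                      # time-aligned values of the causal variable
    skills = []
    for L in lib_sizes:
        rho = []
        for _ in range(trials):
            lib = rng.choice(len(states), size=L, replace=False)
            preds = np.empty(len(states))
            for t in range(len(states)):
                cand = lib[lib != t]            # leave the prediction point out
                d = np.linalg.norm(states[cand] - states[t], axis=1)
                order = np.argsort(d)[:E + 1]   # E + 1 nearest library neighbors
                w = np.exp(-d[order] / max(d[order][0], 1e-12))
                preds[t] = np.sum(w * target[cand[order]]) / np.sum(w)
            rho.append(np.corrcoef(preds, target)[0, 1])
        skills.append(np.mean(rho))
    return skills   # skill that rises and saturates with L suggests a causal signal
```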
Multiview Embedding[23] is a dimensionality reduction technique in which a large number of candidate state-space time series vectors are combinatorially assessed towards maximal model predictability.
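A minimal sketch of the combinatorial ranking step, assuming a matrix whose columns are candidate lagged coordinates (the function names and the simple leave-one-out skill measure are illustrative):

```python
import numpy as np
from itertools import combinations

def view_skill(view, target):
    """Leave-one-out nearest-neighbor prediction of `target` from one
    candidate view, scored by linear correlation."""
    preds = np.empty(len(view))
    for t in range(len(view)):
        d = np.linalg.norm(view - view[t], axis=1)
        d[t] = np.inf                           # exclude the point itself
        preds[t] = target[np.argmin(d)]
    return np.corrcoef(preds, target)[0, 1]

def multiview_rank(coords, target, E):
    """Rank every E-column combination of candidate coordinates by skill;
    forecasts from the top-ranked views are then combined."""
    ranked = [(view_skill(coords[:, cols], target), cols)
              for cols in combinations(range(coords.shape[1]), E)]
    ranked.sort(reverse=True)
    return ranked
```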