In mathematics, the Lyapunov exponent or Lyapunov characteristic exponent of a dynamical system is a quantity that characterizes the rate of separation of infinitesimally close trajectories.
Quantitatively, two trajectories in phase space with initial separation vector δZ₀ diverge (provided that the divergence can be treated within the linearized approximation) at a rate given by |δZ(t)| ≈ e^(λt) |δZ₀|, where λ is the Lyapunov exponent.
The rate of separation can be different for different orientations of the initial separation vector.
Thus, there is a spectrum of Lyapunov exponents—equal in number to the dimensionality of the phase space.
It is common to refer to the largest one as the maximal Lyapunov exponent (MLE), because it determines a notion of predictability for a dynamical system.
A positive MLE is usually taken as an indication that the system is chaotic (provided some other conditions are met, e.g., phase space compactness).
Note that an arbitrary initial separation vector will typically contain some component in the direction associated with the MLE, and because of the exponential growth rate, the effect of the other exponents will be obliterated over time.
The maximal Lyapunov exponent can be defined as λ = lim_{t→∞} lim_{|δZ₀|→0} (1/t) ln(|δZ(t)|/|δZ₀|); the limit |δZ₀| → 0 ensures the validity of the linear approximation at any time.[1]
For discrete-time systems (maps or fixed-point iterations) x_{n+1} = f(x_n), for an orbit starting with x₀ this translates into λ(x₀) = lim_{n→∞} (1/n) Σ_{i=0}^{n−1} ln|f'(x_i)|.
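For illustration, a minimal sketch of this discrete-time formula applied to the logistic map x_{n+1} = r·x_n·(1 − x_n); the choice of map, the parameter r = 4, the seed, and the iteration counts are illustrative assumptions, not part of the cited sources. For r = 4 the exact value is ln 2.

```python
import math

def logistic_map_lyapunov(r=4.0, x0=0.3, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of the logistic map x -> r*x*(1-x)
    via the discrete-time formula lambda = (1/n) * sum(ln|f'(x_i)|)."""
    x = x0
    # Discard a transient so the orbit settles onto the attractor.
    for _ in range(n_transient):
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))  # ln|f'(x)| with f'(x) = r(1 - 2x)
        x = r * x * (1.0 - x)
    return total / n_iter

if __name__ == "__main__":
    # For r = 4 the exact value is ln 2 ≈ 0.6931.
    print(logistic_map_lyapunov())
```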
The choice of starting point may determine which attractor the system ends up on, if there is more than one.
The set of Lyapunov exponents will be the same for almost all starting points of an ergodic component of the dynamical system.
To introduce the Lyapunov exponents, consider a fundamental matrix X(t) consisting of the linearly independent solutions of the first-order approximation of the system.
The singular values α_j(X(t)) of the matrix X(t) are the square roots of the eigenvalues of X(t)^T X(t), and the largest Lyapunov exponent is λ_max = max_j lim sup_{t→∞} (1/t) ln α_j(X(t)).
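As a sanity check of this definition in a case that can be verified analytically, the sketch below assumes a linear system x' = Ax with an arbitrarily chosen matrix A (an illustrative assumption) and compares the finite-time growth rates (1/t) ln α_j(X(t)) of the singular values of the fundamental matrix X(t) = exp(At) with the real parts of the eigenvalues of A, which are the exact Lyapunov exponents in the linear case.

```python
import numpy as np
from scipy.linalg import expm

# Linear system x' = A x; the fundamental matrix is X(t) = exp(A t).
# A is an arbitrary illustrative choice with eigenvalues -1 and 0.5.
A = np.array([[-1.0, 2.0],
              [ 0.0, 0.5]])

t = 20.0                                      # "large" time for the finite-time estimate
X = expm(A * t)                               # fundamental matrix at time t
sigma = np.linalg.svd(X, compute_uv=False)    # singular values alpha_j(X(t))

finite_time_exponents = np.log(sigma) / t     # (1/t) ln alpha_j(X(t))
print("finite-time estimates:", np.sort(finite_time_exponents)[::-1])
print("Re(eig(A)):           ", np.sort(np.linalg.eigvals(A).real)[::-1])
```

The finite-time estimates approach the exact values 0.5 and −1 as t grows; t should not be taken so large that the smallest singular value falls below the numerical precision of the SVD.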
Perron constructed an example of a second-order system in which the first approximation has negative Lyapunov exponents along a zero solution of the original system but, at the same time, this zero solution of the original nonlinear system is Lyapunov unstable.
It is also possible to construct a reverse example in which the first approximation has positive Lyapunov exponents along a zero solution of the original system but, at the same time, this zero solution of the original nonlinear system is Lyapunov stable.[3][4]
The effect of sign inversion of Lyapunov exponents of solutions of the original system and the system of first approximation with the same initial data was subsequently called the Perron effect.[3][4]
Perron's counterexample shows that a negative largest Lyapunov exponent does not, in general, indicate stability, and that a positive largest Lyapunov exponent does not, in general, indicate chaos.[4]
If the system is conservative (i.e., there is no dissipation), a volume element of the phase space will stay the same along a trajectory.
If the system is dissipative, the sum of Lyapunov exponents is negative.
The Kaplan–Yorke conjecture states that the Lyapunov (Kaplan–Yorke) dimension, D_KY = k + (λ_1 + ... + λ_k)/|λ_{k+1}|, where the exponents are ordered from largest to smallest and k is the largest integer such that λ_1 + ... + λ_k ≥ 0, represents an upper bound for the information dimension of the system.[6]
Moreover, the sum of all the positive Lyapunov exponents gives an estimate of the Kolmogorov–Sinai entropy, according to Pesin's theorem.[7]
Along with widely used numerical methods for estimating and computing the Lyapunov dimension there is an effective analytical approach, which is based on the direct Lyapunov method with special Lyapunov-like functions.
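As a small worked example of the Kaplan–Yorke formula above, the sketch below computes D_KY from a given spectrum; the input spectrum (≈ 0.906, 0, −14.57, values commonly reported for the Lorenz system) is used purely as illustrative data.

```python
import numpy as np

def kaplan_yorke_dimension(exponents):
    """Lyapunov (Kaplan-Yorke) dimension: D = k + (l_1+...+l_k)/|l_{k+1}|,
    where k is the largest index with a non-negative partial sum."""
    lam = np.sort(np.asarray(exponents, dtype=float))[::-1]  # descending order
    partial_sums = np.cumsum(lam)
    k = int(np.sum(partial_sums >= 0))   # number of leading non-negative partial sums
    if k == 0:
        return 0.0                       # even the largest exponent is negative
    if k == len(lam):
        return float(len(lam))           # the sum never becomes negative
    return k + partial_sums[k - 1] / abs(lam[k])

# Example: spectrum commonly reported for the Lorenz system (approximate values).
print(kaplan_yorke_dimension([0.906, 0.0, -14.572]))   # ~2.06
```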
Generally the calculation of Lyapunov exponents, as defined above, cannot be carried out analytically, and in most cases one must resort to numerical techniques.
An early example, which also constituted the first demonstration of the exponential divergence of chaotic trajectories, was carried out by R. H. Miller in 1964.
The most commonly used numerical procedures estimate the Lyapunov exponents by averaging several finite-time approximations of the limit defining them.
One of the most used and effective numerical techniques to calculate the Lyapunov spectrum for a smooth dynamical system relies on periodic Gram–Schmidt orthonormalization of the Lyapunov vectors to avoid a misalignment of all the vectors along the direction of maximal expansion.[16][17][18]
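A minimal sketch of such a reorthonormalization scheme for a smooth map is given below; the Hénon map with its standard parameters a = 1.4, b = 0.3 is an illustrative choice, and a QR decomposition plays the role of the Gram–Schmidt step. Tangent vectors are evolved with the Jacobian, re-orthonormalized at every step, and the exponents are obtained as time averages of the logarithms of the stretching factors.

```python
import numpy as np

def henon(x, y, a=1.4, b=0.3):
    return 1.0 - a * x * x + y, b * x

def henon_jacobian(x, y, a=1.4, b=0.3):
    return np.array([[-2.0 * a * x, 1.0],
                     [b,            0.0]])

def lyapunov_spectrum(n_iter=100_000, n_transient=1000):
    """Estimate the Lyapunov spectrum of the Henon map by evolving a set of
    tangent vectors with the Jacobian and re-orthonormalizing them (QR) at
    every step; the exponents are the averaged logs of the diagonal of R."""
    x, y = 0.1, 0.1
    for _ in range(n_transient):             # let the orbit reach the attractor
        x, y = henon(x, y)

    Q = np.eye(2)                            # orthonormal tangent vectors
    sums = np.zeros(2)
    for _ in range(n_iter):
        Q = henon_jacobian(x, y) @ Q         # evolve tangent vectors
        Q, R = np.linalg.qr(Q)               # Gram-Schmidt / QR re-orthonormalization
        sums += np.log(np.abs(np.diag(R)))   # accumulate stretching factors
        x, y = henon(x, y)
    return sums / n_iter

print(lyapunov_spectrum())   # roughly [0.42, -1.62] for the standard parameters
```

Without the re-orthonormalization step, all tangent vectors would collapse onto the direction of maximal expansion and only the largest exponent could be recovered.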
For the calculation of Lyapunov exponents from limited experimental data, various methods have been proposed.
The main difficulty is that the data does not fully explore the phase space; rather, it is confined to the attractor, which has very limited (if any) extension along certain directions.
These thinner or more singular directions within the data set are the ones associated with the more negative exponents.
The use of nonlinear mappings to model the evolution of small displacements from the attractor has been shown to dramatically improve the ability to recover the Lyapunov spectrum,[19][20] provided the data has a very low level of noise.
The singular nature of the data and its connection to the more negative exponents has also been explored.[21]
Whereas the (global) Lyapunov exponent gives a measure for the total predictability of a system, it is sometimes of interest to estimate the local predictability around a point x0 in phase space; this may be done through the eigenvalues of the Jacobian matrix at x0, which are also called local Lyapunov exponents.[22]
Unlike the global exponents, local Lyapunov exponents are not invariant under a nonlinear change of coordinates.
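A minimal sketch of this local notion, assuming the one-step convention above (eigenvalues of the Jacobian at x0); the Hénon map and the sample point are arbitrary illustrative choices.

```python
import numpy as np

def henon_jacobian(x, y, a=1.4, b=0.3):
    """One-step Jacobian of the Henon map (x, y) -> (1 - a*x**2 + y, b*x)."""
    return np.array([[-2.0 * a * x, 1.0],
                     [b,            0.0]])

# Local predictability around a chosen point x0 (illustrative choice):
# the eigenvalues of the Jacobian at x0 are the local Lyapunov exponents in
# the sense above; their log-magnitudes give local one-step expansion rates.
x0, y0 = 0.6, 0.2
eigvals = np.linalg.eigvals(henon_jacobian(x0, y0))
print("eigenvalues:       ", eigvals)
print("log |eigenvalues|: ", np.log(np.abs(eigvals)))
```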