In probability theory, the law of the iterated logarithm describes the magnitude of the fluctuations of a random walk.[2]

Let {Y_n} be independent, identically distributed random variables with zero means and unit variances, and let S_n = Y_1 + Y_2 + \cdots + Y_n. Then

\[
\limsup_{n\to\infty} \frac{S_n}{\sqrt{2n\log\log n}} = 1 \quad \text{a.s.},
\]

where "log" is the natural logarithm, "lim sup" denotes the limit superior, and "a.s." stands for "almost surely".

Another statement, due to Kolmogorov, applies to independent (not necessarily identically distributed) random variables {Y_n} with zero means and bounded increments: writing B_n = \operatorname{Var}(S_n), if B_n \to \infty and |Y_n| \le M_n a.s. for constants M_n = o\bigl(\sqrt{B_n/\log\log B_n}\bigr), then we have

\[
\limsup_{n\to\infty} \frac{S_n}{\sqrt{2B_n\log\log B_n}} = 1 \quad \text{a.s.}
\]

Note that the first statement covers the case of the standard normal distribution, but the second does not, since normal increments are unbounded.
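The classical statement can be checked numerically. Below is a minimal sketch (the function name and parameters are illustrative, not from the source) that simulates a ±1 random walk, whose increments have zero mean and unit variance, and tracks the running maximum of S_n / √(2n log log n):

```python
import math
import random

def running_lil_max(n_steps, seed=0):
    """Simulate a +/-1 random walk (zero-mean, unit-variance increments)
    and return the running maximum of S_n / sqrt(2 n log log n),
    starting at n = 10 so that log log n is safely positive."""
    rng = random.Random(seed)
    s = 0
    best = float("-inf")
    for n in range(1, n_steps + 1):
        s += rng.choice((-1, 1))
        if n >= 10:
            best = max(best, s / math.sqrt(2 * n * math.log(math.log(n))))
    return best

print(running_lil_max(200_000))
```

For large walks the running maximum tends toward 1, but convergence is extremely slow (the correction involves log log n), so any single finite run is noisy.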
There are two versions of the law of large numbers — the weak and the strong — and they both state that the sums S_n, scaled by n^{-1}, converge to zero, respectively in probability and almost surely:

\[
\frac{S_n}{n} \ \xrightarrow{\ p\ }\ 0, \qquad \frac{S_n}{n} \ \xrightarrow{\ \text{a.s.}\ }\ 0, \qquad \text{as } n\to\infty.
\]

On the other hand, the central limit theorem states that the sums S_n scaled by the factor n^{-1/2} converge in distribution to a standard normal distribution. By Kolmogorov's zero–one law, for any fixed M, the probability that the event \{\limsup_{n} S_n/\sqrt{n} \ge M\} occurs is 0 or 1. Then

\[
\Pr\Bigl(\limsup_{n} S_n/\sqrt{n} \ge M\Bigr) \ \ge\ \limsup_{n} \Pr\bigl(S_n/\sqrt{n} \ge M\bigr) = \Pr\bigl(\mathcal{N}(0,1) \ge M\bigr) > 0,
\]

so \limsup_{n} S_n/\sqrt{n} = \infty almost surely. An identical argument shows that \liminf_{n} S_n/\sqrt{n} = -\infty almost surely. This implies that the quantities S_n/\sqrt{n} cannot converge almost surely. The law of the iterated logarithm provides the scaling factor where the two limits become different:

\[
\limsup_{n\to\infty} \frac{S_n}{\sqrt{2n\log\log n}} = 1, \qquad \liminf_{n\to\infty} \frac{S_n}{\sqrt{2n\log\log n}} = -1, \qquad \text{a.s.}
\]

Thus, although the absolute value of the quantity S_n/\sqrt{2n\log\log n} is less than any predefined \varepsilon > 0 with probability approaching one, it will nevertheless almost surely exceed \varepsilon infinitely often.
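A short simulation (the helper name and checkpoints are my own) contrasts the three scalings on a single ±1 random walk: under n^{-1} the walk shrinks to zero, under n^{-1/2} it keeps fluctuating on the order of a standard normal variable, and under the iterated-logarithm scaling it stays bounded:

```python
import math
import random

def scaled_walk(n_steps, seed=1):
    """Follow one +/-1 random walk and record S_n under three scalings:
    S_n/n (law of large numbers), S_n/sqrt(n) (CLT), and
    S_n/sqrt(2 n log log n) (law of the iterated logarithm)."""
    rng = random.Random(seed)
    s = 0
    rows = []
    checkpoints = {10**3, 10**4, 10**5, 10**6}
    for n in range(1, n_steps + 1):
        s += rng.choice((-1, 1))
        if n in checkpoints:
            rows.append((n, s / n, s / math.sqrt(n),
                         s / math.sqrt(2 * n * math.log(math.log(n)))))
    return rows

for n, lln, clt, lil in scaled_walk(10**6):
    print(f"n={n:>7}  S_n/n={lln:+.4f}  S_n/sqrt(n)={clt:+.3f}  LIL-scaled={lil:+.3f}")
```

The first column drifts to zero while the second keeps wandering; the third remains of order one, in line with the almost-sure limits ±1 above.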
The law of the iterated logarithm (LIL) for a sum of independent and identically distributed (i.i.d.)
random variables with zero mean and bounded increment dates back to Khinchin and Kolmogorov in the 1920s.
Since then, there has been a tremendous amount of work on the LIL for various kinds of dependent structures and for stochastic processes.
Hartman–Wintner (1940) generalized the LIL to random walks whose increments have zero mean and finite variance.
De Acosta (1983) gave a simple proof of the Hartman–Wintner version of the LIL.[6] Chung (1948) proved another version of the law of the iterated logarithm for the absolute value of a Brownian motion.[9] Wittmann (1985) generalized the Hartman–Wintner version of the LIL to random walks satisfying milder conditions.
Yongge Wang (1996) showed that the law of the iterated logarithm also holds for polynomial-time pseudorandom sequences.
Balsubramani (2014) proved a non-asymptotic LIL that holds over finite-time martingale sample paths.[14] This subsumes the martingale LIL, since it provides matching finite-sample concentration and anti-concentration bounds, and it enables sequential testing[15] and other applications.