Method of moments (probability theory)

In probability theory, the method of moments is a way of proving convergence in distribution by proving convergence of the corresponding sequences of moments.[1]

Suppose X is a random variable and that all of the moments

\[
\operatorname{E}(X^k), \qquad k = 1, 2, \ldots,
\]

exist. Suppose further that the probability distribution of X is completely determined by its moments, that is, no other probability distribution has the same moment sequence (cf. the moment problem). If \(X_1, X_2, \ldots\) is a sequence of random variables such that

\[
\lim_{n \to \infty} \operatorname{E}(X_n^k) = \operatorname{E}(X^k)
\]

for all values of k, then the sequence \(\{X_n\}\) converges to X in distribution.
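As a brief illustration of how the theorem is used (a sketch relying only on standard facts about factorial moments), consider the classical Poisson limit theorem. Let \(X_n \sim \operatorname{Binomial}(n, \lambda/n)\) and \(X \sim \operatorname{Poisson}(\lambda)\). The k-th factorial moment of \(X_n\) is

\[
\operatorname{E}\bigl[X_n (X_n - 1) \cdots (X_n - k + 1)\bigr]
= n(n-1)\cdots(n-k+1)\left(\frac{\lambda}{n}\right)^{k}
\;\longrightarrow\; \lambda^{k},
\]

which is the k-th factorial moment of \(X\). Ordinary moments are finite linear combinations of factorial moments (with Stirling-number coefficients), so \(\operatorname{E}(X_n^k) \to \operatorname{E}(X^k)\) for every k; since the Poisson distribution is determined by its moments, \(X_n\) converges to \(X\) in distribution.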

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2]

More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices.
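In the random-matrix setting, the relevant moments are the expected trace moments of the empirical spectral distribution. As a sketch of the standard argument: for an n × n Wigner matrix \(A_n\) with independent centred entries of unit variance, a combinatorial count in which non-crossing pairings dominate gives

\[
\operatorname{E}\!\left[\frac{1}{n}\operatorname{tr}\!\left(\frac{A_n}{\sqrt{n}}\right)^{\!k}\right]
\;\longrightarrow\;
\begin{cases} C_{k/2}, & k \text{ even},\\ 0, & k \text{ odd},\end{cases}
\]

where \(C_m\) denotes the m-th Catalan number. These limits are exactly the moments of the semicircle distribution, which is determined by its moments because it has bounded support, so the empirical spectral distribution converges (here, in expectation; almost-sure versions require an additional variance estimate) to the semicircle law.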