Cumulant

In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments.

An advantage of H(t), in some sense the function K(t) evaluated for purely imaginary arguments, is that E[e^(itX)] is well defined for all real values of t even when E[e^(tX)] is not, such as can occur when there is "too much" probability that X has a large magnitude.

Nevertheless, even when H(t) does not have a long Maclaurin series, it can be used directly in analyzing random variables and, in particular, in adding them.

The probability distributions above admit a unified formula for the derivative of the cumulant generating function.[citation needed]

The cumulant generating function K(t), if it exists, is infinitely differentiable and convex, and passes through the origin.

Its first derivative ranges monotonically in the open interval from the infimum to the supremum of the support of the probability distribution, and its second derivative is strictly positive everywhere it is defined, except for the degenerate distribution of a single point mass.
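These properties can be verified symbolically in a concrete case. A sketch assuming SymPy, using the Poisson cumulant generating function K(t) = λ(e^t − 1) with an illustrative rate λ = 3/2:

```python
import sympy as sp

t = sp.symbols("t", real=True)
lam = sp.Rational(3, 2)  # illustrative Poisson rate λ = 3/2

# Cumulant generating function of a Poisson(λ) variable: K(t) = λ(e^t - 1).
K = lam * (sp.exp(t) - 1)

# Passes through the origin.
assert K.subs(t, 0) == 0

# Second derivative K''(t) = λ e^t is strictly positive, so K is convex.
K2 = sp.diff(K, t, 2)
assert sp.simplify(K2 - lam * sp.exp(t)) == 0

# First derivative K'(t) = λ e^t increases monotonically over (0, ∞), the open
# interval from the infimum (0) to the supremum (∞) of the support {0, 1, 2, ...}.
K1 = sp.diff(K, t)
assert sp.limit(K1, t, -sp.oo) == 0
assert sp.limit(K1, t, sp.oo) == sp.oo
```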

The cumulant generating function exists if and only if the tails of the distribution are majorized by an exponential decay, that is (see Big O notation), F(x) = O(e^(cx)) as x → −∞ and 1 − F(x) = O(e^(−dx)) as x → +∞ for some constants c, d > 0, where F is the cumulative distribution function.

For a degenerate point mass at c, the cumulant generating function is the straight line K(t) = ct.

[6] The natural exponential family of a distribution may be realized by shifting or translating K(t), and adjusting it vertically so that it always passes through the origin: if f is the pdf with cumulant generating function K(t) = log M(t), then the exponentially tilted density f(x | θ) = e^(θx) f(x) / M(θ) has cumulant generating function Kθ(t) = K(t + θ) − K(θ), which again passes through the origin.

[7] The underlying result here is that the cumulant generating function cannot be a polynomial of degree greater than 2.

The moments can be recovered in terms of cumulants by evaluating the nth derivative of exp(K(t)) at t = 0.

Likewise, the cumulants can be recovered in terms of moments by evaluating the nth derivative of log M(t) at t = 0.
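Both recoveries can be checked symbolically. A sketch assuming SymPy, using the Exponential(1) distribution, for which M(t) = 1/(1 − t), the moments are n!, and the cumulants are (n − 1)!:

```python
import sympy as sp

t = sp.symbols("t")

# Exponential(1): M(t) = 1/(1 - t), K(t) = log M(t) = -log(1 - t).
M = 1 / (1 - t)
K = sp.log(M)

# nth moment = nth derivative of exp(K(t)), evaluated at t = 0.
moments = [sp.diff(sp.exp(K), t, n).subs(t, 0) for n in range(1, 6)]
assert moments == [sp.factorial(n) for n in range(1, 6)]  # n!

# nth cumulant = nth derivative of log M(t), evaluated at t = 0.
cumulants = [sp.diff(sp.log(M), t, n).subs(t, 0) for n in range(1, 6)]
assert cumulants == [sp.factorial(n - 1) for n in range(1, 6)]  # (n-1)!
```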

The explicit expression for the nth moment in terms of the first n cumulants, and vice versa, can be obtained by using Faà di Bruno's formula for higher derivatives of composite functions.

To express the central moments as functions of the cumulants, just drop from these polynomials all terms in which κ1 appears as a factor.

Similarly, the nth cumulant κn is an nth-degree polynomial in the first n non-central moments.

To express the cumulants κn for n > 1 as functions of the central moments, drop from these polynomials all terms in which μ'1 appears as a factor.

The cumulants can be related to the moments by differentiating the relationship log M(t) = K(t) with respect to t, giving M′(t) = K′(t) M(t), which conveniently contains no exponentials or logarithms.
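Applying the Leibniz rule to M′(t) = K′(t) M(t) and setting t = 0 gives the recursion μ'n = ∑ C(n−1, k) κ(k+1) μ'(n−1−k) over k = 0, …, n−1, with μ'0 = 1. A plain-Python sketch (the function name is illustrative), checked against the standard normal:

```python
from math import comb

def moments_from_cumulants(kappa, nmax):
    """Raw moments mu'_1..mu'_nmax from cumulants kappa_1..kappa_nmax.

    Differentiating log M(t) = K(t) gives M'(t) = K'(t) M(t); applying the
    Leibniz rule n-1 times and setting t = 0 yields the recursion
        mu'_n = sum_{k=0}^{n-1} C(n-1, k) kappa_{k+1} mu'_{n-1-k},  mu'_0 = 1.
    """
    mu = [1]  # mu'_0 = E[X^0] = 1
    for n in range(1, nmax + 1):
        mu.append(sum(comb(n - 1, k) * kappa[k] * mu[n - 1 - k]
                      for k in range(n)))
    return mu[1:]

# Standard normal: kappa_1 = 0, kappa_2 = 1, kappa_n = 0 for n > 2;
# the raw moments are 0, 1, 0, 3, 0, 15 (odd moments vanish, even ones
# are the double factorials).
print(moments_from_cumulants([0, 1, 0, 0, 0, 0], 6))  # [0, 1, 0, 3, 0, 15]
```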

These polynomials have a remarkable combinatorial interpretation: the coefficients count certain partitions of sets.
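The partition interpretation can be made concrete by computing a moment as a sum over set partitions, each partition contributing the product of cumulants indexed by its block sizes. A plain-Python sketch (function names illustrative); with every cumulant equal to 1, as for Poisson(1), the moments are the Bell numbers, which count set partitions:

```python
from math import prod

def partitions(elements):
    """Yield all set partitions of `elements` as lists of blocks."""
    if not elements:
        yield []
        return
    first, rest = elements[0], elements[1:]
    for part in partitions(rest):
        # Insert `first` into each existing block in turn ...
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        # ... or let it start a new block.
        yield [[first]] + part

def moment_from_cumulants(kappa, n):
    """nth raw moment as a sum over set partitions of {1,...,n}:
    mu'_n = sum over partitions pi of prod over blocks B of kappa_{|B|}."""
    return sum(
        prod(kappa[len(block) - 1] for block in part)
        for part in partitions(list(range(n)))
    )

# All cumulants equal to 1 (the Poisson(1) case): moments are Bell numbers.
print([moment_from_cumulants([1] * 5, n) for n in range(1, 6)])  # [1, 2, 5, 15, 52]
```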

Further connection between cumulants and combinatorics can be found in the work of Gian-Carlo Rota, where links to invariant theory, symmetric functions, and binomial sequences are studied via umbral calculus.

The joint cumulant of random variables can be expressed as an alternating sum of products of their mixed moments; see Equation (3.2.7) in [11]. For zero-mean random variables, for example,

κ(X, Y, Z, W) = E(XYZW) − E(XY)E(ZW) − E(XZ)E(YW) − E(XW)E(YZ).
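This four-term identity can be checked against the general partition formula for joint cumulants, in which a partition π contributes (−1)^(|π|−1) (|π|−1)! times the product of the mixed moments of its blocks. A sketch assuming SymPy, with symbolic mixed moments and singleton means set to zero:

```python
import sympy as sp
from functools import reduce
from math import factorial
from operator import mul

def partitions(elements):
    """Yield all set partitions of `elements` as lists of blocks."""
    if not elements:
        yield []
        return
    first, rest = elements[0], elements[1:]
    for part in partitions(rest):
        for i in range(len(part)):
            yield part[:i] + [[first] + part[i]] + part[i + 1:]
        yield [[first]] + part

NAMES = "XYZW"

def mixed_moment(block):
    """Symbolic mixed moment E(prod X_i); singleton means are set to zero."""
    if len(block) == 1:
        return sp.Integer(0)
    return sp.Symbol("E" + "".join(NAMES[i] for i in sorted(block)))

# General partition formula for the fourth joint cumulant:
# kappa(X,Y,Z,W) = sum over partitions pi of
#   (-1)^(|pi|-1) (|pi|-1)! * prod over blocks B of E(prod_{i in B} X_i).
kappa4 = sum(
    (-1) ** (len(part) - 1) * factorial(len(part) - 1)
    * reduce(mul, [mixed_moment(b) for b in part], sp.Integer(1))
    for part in partitions([0, 1, 2, 3])
)

# With zero means, only the whole set and the three pairings survive.
expected = (sp.Symbol("EXYZW")
            - sp.Symbol("EXY") * sp.Symbol("EZW")
            - sp.Symbol("EXZ") * sp.Symbol("EYW")
            - sp.Symbol("EXW") * sp.Symbol("EYZ"))
assert sp.expand(kappa4 - expected) == 0
```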

More generally, any coefficient of the Maclaurin series of the joint cumulant generating function can also be expressed in terms of mixed moments, although there are no concise formulae.

Indeed, as noted above, one can write it as a joint cumulant by repeating random variables appropriately, and then apply the above formula to express it in terms of mixed moments.

[citation needed] The combinatorial meaning of the expression of mixed moments in terms of cumulants is easier to understand than that of cumulants in terms of mixed moments; see Equation (3.2.6) in [11].

A system in equilibrium with a thermal bath at temperature T has a fluctuating internal energy E, which can be considered a random variable drawn from the Boltzmann distribution.

Other free energies can be functions of other variables, such as the magnetic field or the chemical potential.
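This connection can be illustrated on a toy system. A sketch assuming SymPy, for a two-level system with energies {0, ε} (purely illustrative): derivatives of log Z with respect to −β give the cumulants of the internal energy, so the first two cumulants are the mean energy and the energy fluctuations:

```python
import sympy as sp

beta, eps = sp.symbols("beta epsilon", positive=True)

# Two-level system with energies {0, eps} (illustrative):
# partition function Z(beta) = 1 + e^(-beta*eps).
Z = 1 + sp.exp(-beta * eps)
logZ = sp.log(Z)

# Boltzmann probability of the excited state.
p = sp.exp(-beta * eps) / Z

# First cumulant of E: mean energy <E> = -d(log Z)/d(beta).
meanE = -sp.diff(logZ, beta)
assert sp.simplify(meanE - eps * p) == 0

# Second cumulant of E: Var(E) = d^2(log Z)/d(beta)^2 = eps^2 p (1 - p).
varE = sp.diff(logZ, beta, 2)
assert sp.simplify(varE - eps**2 * p * (1 - p)) == 0
```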

[19] Stephen Stigler has said[citation needed] that the name cumulant was suggested to Fisher in a letter from Harold Hotelling.

[20] The partition function in statistical physics was introduced by Josiah Willard Gibbs in 1901.

[citation needed] More generally, the cumulants of a sequence { mn : n = 1, 2, 3, ... }, not necessarily the moments of any probability distribution, are, by definition, given by the formal identity

1 + ∑n≥1 mn t^n / n! = exp( ∑n≥1 κn t^n / n! ),

where the values of κn for n = 1, 2, 3, ... are found formally, i.e., by algebra alone, in disregard of questions of whether any series converges.
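A sketch of this formal recipe, assuming SymPy; the sequence mn = n! is illustrative (it happens to be the moment sequence of an Exponential(1) variable, with formal cumulants κn = (n − 1)!):

```python
import sympy as sp

t = sp.symbols("t")
N = 6

# Illustrative sequence m_n = n!; the recipe is purely algebraic and does not
# assume the m_n are moments of any distribution.
m = [sp.factorial(n) for n in range(1, N + 1)]
series = 1 + sum(m[n - 1] * t**n / sp.factorial(n) for n in range(1, N + 1))

# Formal cumulants: log(1 + sum m_n t^n/n!) = sum kappa_n t^n/n!
# as a formal power series, truncated at order N.
logseries = sp.log(series).series(t, 0, N + 1).removeO()
kappa = [sp.factorial(n) * logseries.coeff(t, n) for n in range(1, N + 1)]
print(kappa)  # [1, 1, 2, 6, 24, 120], i.e. kappa_n = (n-1)!
```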