In probability theory and statistics, the noncentral chi-squared distribution (or noncentral chi-square distribution, noncentral $\chi^2$ distribution) is a noncentral generalization of the chi-squared distribution.
It often arises in the power analysis of statistical tests in which the null distribution is (perhaps asymptotically) a chi-squared distribution; important examples of such tests are the likelihood-ratio tests.
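For example, the power of such a test can be computed by evaluating the noncentral chi-squared survival function at the central chi-squared critical value; a minimal sketch (the degrees of freedom, noncentrality and significance level below are illustrative assumptions, not values from the text):

```python
from scipy.stats import chi2, ncx2

# Illustrative values: a test with 3 degrees of freedom at the 5% level,
# whose statistic is noncentral chi-squared with noncentrality 4 under the alternative.
df, lam, alpha = 3, 4.0, 0.05

crit = chi2.ppf(1 - alpha, df)   # rejection threshold under the null (central chi-squared)
power = ncx2.sf(crit, df, lam)   # P(statistic > threshold) under the alternative
print(f"critical value = {crit:.3f}, power = {power:.3f}")
```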
Let $(X_1, X_2, \ldots, X_k)$ be k independent, normally distributed random variables with means $\mu_i$ and unit variances. Then the random variable
$$Z = \sum_{i=1}^{k} X_i^2$$
is distributed according to the noncentral chi-squared distribution. It has two parameters: k, which specifies the number of degrees of freedom (i.e. the number of the $X_i$), and $\lambda$, which is related to the mean of the random variables $X_i$ by
$$\lambda = \sum_{i=1}^{k} \mu_i^2.$$
The parameter $\lambda$ is sometimes called the noncentrality parameter. Note that some references define $\lambda$ in other ways, such as half of the above sum, or its square root.
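As a quick sanity check of this definition, the sum of squared shifted normals can be simulated and compared with a library implementation of the noncentral chi-squared distribution; a rough sketch (the means below are arbitrary, and the comparison uses the standard mean and variance formulas given later in the article):

```python
import numpy as np
from scipy.stats import ncx2

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])          # arbitrary means; unit variances assumed
k, lam = len(mu), float(np.sum(mu**2))   # degrees of freedom and noncentrality

# Z = sum_i X_i^2 with X_i ~ N(mu_i, 1)
samples = ((rng.standard_normal((100_000, k)) + mu) ** 2).sum(axis=1)

print("sample mean:", samples.mean(), " theory:", k + lam)
print("sample var: ", samples.var(),  " theory:", 2 * (k + 2 * lam))
print("P(Z <= 5):  ", (samples <= 5).mean(), " theory:", ncx2.cdf(5, k, lam))
```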
This distribution arises in multivariate statistics as a derivative of the multivariate normal distribution.
While the central chi-squared distribution is the squared norm of a random vector with $N(0_k, I_k)$ distribution (i.e., the squared distance from the origin to a point taken at random from that distribution), the noncentral $\chi^2$ is the squared norm of a random vector with $N(\mu, I_k)$ distribution, where $0_k$ is a zero vector of length k, $\mu = (\mu_1, \ldots, \mu_k)$ and $I_k$ is the identity matrix of size k. The probability density function (pdf) is given by
$$f_X(x; k, \lambda) = \sum_{i=0}^{\infty} \frac{e^{-\lambda/2} (\lambda/2)^i}{i!} f_{Y_{k+2i}}(x),$$
where $Y_q$ is distributed as chi-squared with q degrees of freedom.
From this representation, the noncentral chi-squared distribution is seen to be a Poisson-weighted mixture of central chi-squared distributions.
Suppose that a random variable J has a Poisson distribution with mean $\lambda/2$, and the conditional distribution of Z given J = i is chi-squared with k + 2i degrees of freedom. Then the unconditional distribution of Z is noncentral chi-squared with k degrees of freedom and noncentrality parameter $\lambda$.
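A small simulation sketch of this mixture representation: draw J from a Poisson($\lambda/2$) distribution, then a central chi-squared variate with k + 2J degrees of freedom, and compare with the noncentral chi-squared cdf (the parameter values below are arbitrary):

```python
import numpy as np
from scipy.stats import poisson, chi2, ncx2

rng = np.random.default_rng(1)
k, lam, n = 4, 6.0, 200_000

# Poisson-weighted mixture of central chi-squared distributions
j = poisson.rvs(lam / 2, size=n, random_state=rng)
z_mix = chi2.rvs(k + 2 * j, random_state=rng)

# Compare a few empirical cdf values with the noncentral chi-squared cdf
for x in (2.0, 8.0, 15.0):
    print(x, (z_mix <= x).mean(), ncx2.cdf(x, k, lam))
```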
Alternatively, the pdf can be written as
$$f_X(x; k, \lambda) = \frac{1}{2} e^{-(x+\lambda)/2} \left(\frac{x}{\lambda}\right)^{k/4 - 1/2} I_{k/2 - 1}\!\left(\sqrt{\lambda x}\right),$$
where $I_\nu(y)$ is a modified Bessel function of the first kind given by
$$I_\nu(y) = \left(\frac{y}{2}\right)^{\nu} \sum_{j=0}^{\infty} \frac{(y^2/4)^j}{j!\,\Gamma(\nu + j + 1)}.$$
Using the relation between Bessel functions and hypergeometric functions, the pdf can also be written as:[2]
$$f_X(x; k, \lambda) = e^{-\lambda/2}\, {}_0F_1\!\left(; \tfrac{k}{2}; \tfrac{\lambda x}{4}\right) \frac{1}{2^{k/2}\,\Gamma(k/2)}\, e^{-x/2}\, x^{k/2 - 1}.$$
The case k = 0 (zero degrees of freedom), in which the distribution has a discrete component at zero, is discussed by Torgersen (1972) and further by Siegel (1979).
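The Bessel-function form can be evaluated directly with standard special-function routines; a minimal sketch comparing it with a library pdf, with arbitrary example values of x, k and $\lambda$:

```python
import numpy as np
from scipy.special import iv
from scipy.stats import ncx2

def ncx2_pdf_bessel(x, k, lam):
    """Noncentral chi-squared pdf via the modified Bessel function I_{k/2-1}."""
    return (0.5 * np.exp(-(x + lam) / 2)
            * (x / lam) ** (k / 4 - 0.5)
            * iv(k / 2 - 1, np.sqrt(lam * x)))

x = np.array([0.5, 2.0, 7.5])
k, lam = 5, 3.0
print(ncx2_pdf_bessel(x, k, lam))   # Bessel-form evaluation
print(ncx2.pdf(x, k, lam))          # library reference values
```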
[3][4] The derivation of the probability density function is most easily done by performing the following steps: (1) since the $X_i$ have unit variances, their joint distribution is spherically symmetric up to a location shift, so the distribution of Z depends on the means only through $\lambda = \sum_i \mu_i^2$, and one may take $\mu_1 = \sqrt{\lambda}$ and $\mu_2 = \cdots = \mu_k = 0$ without loss of generality; (2) derive the density for the case k = 1 by a change of variables and expand its hyperbolic-cosine factor in a Taylor series, which produces the Poisson-weighted mixture of central chi-squared densities; (3) for general k, the remaining terms contribute an independent central chi-squared variable with k − 1 degrees of freedom, and convolving gives the density above. The moment-generating function is given by
$$M(t; k, \lambda) = \frac{\exp\!\left(\dfrac{\lambda t}{1 - 2t}\right)}{(1 - 2t)^{k/2}}, \qquad 2t < 1.$$
The first few raw moments are:
$$\mu_1' = k + \lambda$$
$$\mu_2' = (k+\lambda)^2 + 2(k + 2\lambda)$$
$$\mu_3' = (k+\lambda)^3 + 6(k+\lambda)(k+2\lambda) + 8(k+3\lambda)$$
$$\mu_4' = (k+\lambda)^4 + 12(k+\lambda)^2(k+2\lambda) + 4(11k^2 + 44k\lambda + 36\lambda^2) + 48(k+4\lambda)$$
The first few central moments are:
$$\mu_2 = 2(k + 2\lambda)$$
$$\mu_3 = 8(k + 3\lambda)$$
$$\mu_4 = 12(k+2\lambda)^2 + 48(k + 4\lambda)$$
The nth cumulant is
$$\kappa_n = 2^{n-1}(n-1)!\,(k + n\lambda).$$
Hence
$$\mu_n' = 2^{n-1}(n-1)!\,(k + n\lambda) + \sum_{j=1}^{n-1} \frac{(n-1)!\,2^{j-1}}{(n-j)!}\,(k + j\lambda)\,\mu_{n-j}'.$$
Again using the relation between the central and noncentral chi-squared distributions, the cumulative distribution function (cdf) can be written as
$$P(x; k, \lambda) = e^{-\lambda/2} \sum_{j=0}^{\infty} \frac{(\lambda/2)^j}{j!}\, Q(x; k + 2j),$$
where
$Q(x; k)$ is the cumulative distribution function of the central chi-squared distribution with k degrees of freedom, which is given by
$$Q(x; k) = \frac{\gamma(k/2,\, x/2)}{\Gamma(k/2)},$$
with $\gamma(s, z)$ the lower incomplete gamma function. The Marcum Q-function $Q_M(a, b)$ can also be used to represent the cdf:
$$P(x; k, \lambda) = 1 - Q_{\frac{k}{2}}\!\left(\sqrt{\lambda}, \sqrt{x}\right).$$
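Because the Poisson weights in the cdf series decay factorially, a truncated sum already agrees closely with library values; a rough sketch (the truncation length and parameter values below are arbitrary choices):

```python
import numpy as np
from scipy.stats import chi2, poisson, ncx2

def ncx2_cdf_series(x, k, lam, terms=200):
    """Truncated Poisson-weighted sum of central chi-squared cdfs."""
    j = np.arange(terms)
    weights = poisson.pmf(j, lam / 2)        # e^{-lam/2} (lam/2)^j / j!
    return float(np.sum(weights * chi2.cdf(x, k + 2 * j)))

k, lam = 3, 10.0
for x in (5.0, 12.0, 25.0):
    print(x, ncx2_cdf_series(x, k, lam), ncx2.cdf(x, k, lam))
```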
[5] When the degrees of freedom k is a positive odd integer, the complementary cumulative distribution function has a closed-form expression[6] as a finite sum involving the Gaussian Q-function Q and the modified Bessel function I of the first kind with half-integer order, where n is a non-negative integer with k = 2n + 1. The modified Bessel function of the first kind with half-integer order can itself be represented as a finite sum in terms of hyperbolic functions.
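In the simplest odd case, k = 1, the complementary cdf reduces to two Gaussian Q-function terms, since $Z = X_1^2$ with $X_1 \sim N(\sqrt{\lambda}, 1)$ gives $P(Z > x) = Q(\sqrt{x} - \sqrt{\lambda}) + Q(\sqrt{x} + \sqrt{\lambda})$. A brief sketch checking this against a library implementation (the helper name is ad hoc):

```python
import numpy as np
from scipy.stats import norm, ncx2

def ncx2_sf_k1(x, lam):
    """Complementary cdf for k = 1 via two Gaussian Q-function terms."""
    q = norm.sf                              # Gaussian Q-function
    return q(np.sqrt(x) - np.sqrt(lam)) + q(np.sqrt(x) + np.sqrt(lam))

lam = 4.0
for x in (1.0, 6.0, 15.0):
    print(x, ncx2_sf_k1(x, lam), ncx2.sf(x, 1, lam))
```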
Sankaran discusses a number of closed-form approximations for the cumulative distribution function.[8] In an earlier paper, he derived and stated the following approximation:[9]
$$P(x; k, \lambda) \approx \Phi\!\left\{ \frac{\left(\frac{x}{k+\lambda}\right)^{h} - \left(1 + h p\,(h - 1 - 0.5(2 - h) m p)\right)}{h \sqrt{2p}\,(1 + 0.5\, m p)} \right\}$$
where $\Phi$ denotes the cumulative distribution function of the standard normal distribution,
$$h = 1 - \frac{2}{3} \frac{(k+\lambda)(k+3\lambda)}{(k+2\lambda)^2}, \qquad p = \frac{k+2\lambda}{(k+\lambda)^2}, \qquad m = (h-1)(1-3h).$$
This and other approximations are discussed in a later text book.
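A sketch of this approximation as reconstructed above, compared with an exact cdf; the constants should be treated as tentative, since the formula here is a reconstruction and the parameter values are arbitrary:

```python
import numpy as np
from scipy.stats import norm, ncx2

def sankaran_cdf(x, k, lam):
    """Normal approximation to the noncentral chi-squared cdf (formula as reconstructed above)."""
    h = 1 - (2 / 3) * (k + lam) * (k + 3 * lam) / (k + 2 * lam) ** 2
    p = (k + 2 * lam) / (k + lam) ** 2
    m = (h - 1) * (1 - 3 * h)
    z = ((x / (k + lam)) ** h - (1 + h * p * (h - 1 - 0.5 * (2 - h) * m * p))) \
        / (h * np.sqrt(2 * p) * (1 + 0.5 * m * p))
    return norm.cdf(z)

k, lam = 10, 5.0
for x in (8.0, 15.0, 25.0):
    print(x, sankaran_cdf(x, k, lam), ncx2.cdf(x, k, lam))
```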
[10] More recently, since the CDF of the non-central chi-squared distribution with odd degrees of freedom can be exactly computed, the CDF for even degrees of freedom can be approximated by exploiting the monotonicity and log-concavity properties of the Marcum Q-function as
$$F_{2k,\lambda}(x) \approx \tfrac{1}{2}\left[ F_{2k-1,\lambda}(x) + F_{2k+1,\lambda}(x) \right].$$
Another approximation that also serves as an upper bound is given by
$$F_{2k,\lambda}(x) \approx 1 - \left[\left(1 - F_{2k-1,\lambda}(x)\right)\left(1 - F_{2k+1,\lambda}(x)\right)\right]^{1/2}.$$
For a given probability, these formulas are easily inverted to provide the corresponding approximation for x, so that approximate quantiles can be computed.
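A sketch of these even-degree approximations, using a library cdf for the neighbouring odd degrees of freedom (which the text notes can be computed exactly); the example parameters are arbitrary:

```python
import numpy as np
from scipy.stats import ncx2

def even_df_cdf_avg(x, two_k, lam):
    """Approximate F_{2k,lam}(x) by averaging the cdfs at the neighbouring odd degrees of freedom."""
    return 0.5 * (ncx2.cdf(x, two_k - 1, lam) + ncx2.cdf(x, two_k + 1, lam))

def even_df_cdf_upper(x, two_k, lam):
    """Upper-bound form based on the complementary cdfs at the neighbouring odd degrees of freedom."""
    return 1 - np.sqrt(ncx2.sf(x, two_k - 1, lam) * ncx2.sf(x, two_k + 1, lam))

two_k, lam = 6, 4.0
for x in (3.0, 9.0, 18.0):
    print(x, even_df_cdf_avg(x, two_k, lam),
          even_df_cdf_upper(x, two_k, lam),
          ncx2.cdf(x, two_k, lam))
```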
Sankaran (1963) discusses transformations of the form $z = \left[(X - b)/(k + \lambda)\right]^{1/2}$. He analyzes the expansions of the cumulants of $z$ up to the term $O\!\left((k+\lambda)^{-4}\right)$ and shows that the choices $b = (k-1)/2$, $b = (k-1)/3$ and $b = (k-1)/4$ make, respectively, the second, third and fourth cumulant of $z$ approximately independent of $\lambda$ and produce reasonable results. Also, a simpler transformation $z_1 = \left(X - (k-1)/2\right)^{1/2}$ can be used as a variance stabilizing transformation that produces a random variable with mean approximately $\left(\lambda + (k-1)/2\right)^{1/2}$.
Usability of these transformations may be hampered by the need to take the square roots of negative numbers.
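A brief empirical sketch of the simpler variance-stabilizing transformation: across several values of $\lambda$ the transformed variable should have roughly constant variance, and the clipping below reflects the caveat about square roots of negative numbers (parameter values are arbitrary):

```python
import numpy as np
from scipy.stats import ncx2

rng = np.random.default_rng(2)
k = 5
for lam in (2.0, 10.0, 50.0):
    x = ncx2.rvs(k, lam, size=100_000, random_state=rng)
    # z1 = sqrt(X - (k-1)/2), clipped at zero to avoid square roots of negative numbers
    z1 = np.sqrt(np.clip(x - (k - 1) / 2, 0, None))
    print(lam, z1.mean(), np.sqrt(lam + (k - 1) / 2), z1.var())
```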
Two-sided normal regression tolerance intervals can be obtained based on the noncentral chi-squared distribution.
[12] This enables the calculation of a statistical interval within which, with some confidence level, a specified proportion of a sampled population falls.