In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable.
In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.
In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1 / X.
If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that

G(y) = Pr(Y ≤ y) = Pr(X ≥ 1/y) = 1 − Pr(X < 1/y) = 1 − F(1/y).

Then the density function of Y is found as the derivative of the cumulative distribution function:

g(y) = f(1/y) / y².

The reciprocal distribution has a density function of the form[1]

f(x) ∝ x⁻¹  for 0 < a ≤ x ≤ b,

where ∝ means "is proportional to". It follows that the inverse distribution in this case is of the form

g(y) ∝ y⁻¹  for b⁻¹ ≤ y ≤ a⁻¹,

which is again a reciprocal distribution.
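As an informal numerical check of this change-of-variables formula (not part of the original text), the Python sketch below applies g(y) = f(1/y)/y² to an arbitrarily chosen exponential density and compares an interval probability computed from g with a Monte Carlo estimate; all parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import quad

def reciprocal_pdf(f, y):
    """Density of Y = 1/X obtained from the density f of X via g(y) = f(1/y) / y**2."""
    return f(1.0 / y) / y**2

# Illustrative choice: X ~ Exponential(rate), which has strictly positive support.
rate = 2.0
f_X = lambda x: rate * np.exp(-rate * x)

rng = np.random.default_rng(0)
y_samples = 1.0 / rng.exponential(scale=1.0 / rate, size=1_000_000)

# The probability of an interval computed from g should match the Monte Carlo estimate.
lo, hi = 0.5, 2.0
empirical = np.mean((y_samples > lo) & (y_samples < hi))
numerical, _ = quad(lambda y: reciprocal_pdf(f_X, y), lo, hi)
print(empirical, numerical)   # both close to exp(-1) - exp(-4), about 0.349
```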
If the original random variable X is uniformly distributed on the interval (a,b), where a > 0, then the reciprocal variable Y = 1/X has the reciprocal distribution which takes values in the range (b⁻¹, a⁻¹), and the probability density function in this range is

g(y) = 1 / ((b − a) y²),

and is zero elsewhere.
The cumulative distribution function of the reciprocal, within the same range, is

G(y) = (b − y⁻¹) / (b − a).

For example, if X is uniformly distributed on the interval (0,1), then Y = 1/X has density g(y) = y⁻² and cumulative distribution function G(y) = 1 − y⁻¹ when y > 1.
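A minimal simulation sketch of the uniform case, with arbitrarily chosen bounds, comparing the empirical distribution of Y = 1/X with the cumulative distribution function above:

```python
import numpy as np

a, b = 2.0, 5.0                     # illustrative bounds, a > 0
rng = np.random.default_rng(1)
y = 1.0 / rng.uniform(a, b, size=1_000_000)

# G(y) = (b - 1/y) / (b - a) on the range (1/b, 1/a)
for y0 in (0.25, 0.35, 0.45):
    print(np.mean(y <= y0), (b - 1.0 / y0) / (b - a))
```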
Let X be a t distributed random variate with k degrees of freedom. Then its density function is

f(x) = Γ((k+1)/2) / (√(kπ) Γ(k/2)) · (1 + x²/k)^(−(k+1)/2),

and, by the general formula above, the density of Y = 1/X is

g(y) = Γ((k+1)/2) / (√(kπ) Γ(k/2)) · (1/y²) (1 + 1/(k y²))^(−(k+1)/2).

With k = 1, the distributions of X and 1/X are identical (X is then Cauchy distributed (0,1)).

If X follows a standard normal distribution, then Y = 1/X follows a reciprocal standard normal distribution, heavy-tailed and bimodal,[2] with modes at ±2^(−1/2) and density

g(y) = (1/(√(2π) y²)) e^(−1/(2y²)),

and the first and higher-order moments do not exist.[2]
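The bimodal, heavy-tailed behaviour can be illustrated numerically; the sketch below (illustrative only) compares a Monte Carlo interval probability for Y = 1/X with the integral of the density given above over an interval containing the positive mode.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

rng = np.random.default_rng(2)
y = 1.0 / rng.standard_normal(2_000_000)

# Density of the reciprocal standard normal: g(y) = phi(1/y) / y^2
g = lambda t: norm.pdf(1.0 / t) / t**2

# Interval probability around the positive mode at 1/sqrt(2), roughly 0.707
lo, hi = 0.5, 1.0
mc = np.mean((y > lo) & (y < hi))
by_density, _ = quad(g, lo, hi)
print(mc, by_density)               # both close to Phi(2) - Phi(1), about 0.136
```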
For such inverse distributions and for ratio distributions, there can still be defined probabilities for intervals, which can be computed either by Monte Carlo simulation or, in some cases, by using the Geary–Hinkley transformation.[3]
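As a rough illustration of these two routes (not taken from the cited sources), the sketch below compares a plain Monte Carlo estimate of an interval probability for Y = 1/X, with X normal and bounded well away from zero, against a Geary–Hinkley-style approximation specialised to a degenerate numerator; the parameter values and the exact form of the approximation used here are assumptions for the example.

```python
import numpy as np
from scipy.stats import norm

mu_x, sigma_x = 10.0, 1.0           # illustrative: X is very unlikely to be near zero
rng = np.random.default_rng(3)
y = 1.0 / rng.normal(mu_x, sigma_x, size=1_000_000)

lo, hi = 0.095, 0.105               # interval of interest for Y = 1/X

# Plain Monte Carlo estimate of P(lo < Y < hi)
mc = np.mean((y > lo) & (y < hi))

# Geary-Hinkley-style approximation, specialised to a degenerate (constant) numerator:
# P(Y <= t) is approximately Phi((mu_x * t - 1) / (sigma_x * t)) for t > 0.
approx = (norm.cdf((mu_x * hi - 1.0) / (sigma_x * hi))
          - norm.cdf((mu_x * lo - 1.0) / (sigma_x * lo)))
print(mc, approx)                   # the two estimates should be close
```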
However, in the more general case of a shifted reciprocal function 1/(p − B), for B following a general normal distribution, the mean and variance statistics do exist in a principal value sense, if the difference between the pole p and the mean μ_B is real-valued.
The mean of this transformed random variable (reciprocal shifted normal distribution) is then indeed the scaled Dawson's function:[4]

E[1/(p − B)] = (√2 / σ_B) D((p − μ_B) / (√2 σ_B)).

In contrast, if the shift p − μ_B is purely complex, the mean exists and is a scaled Faddeeva function, whose exact expression depends on the sign of the imaginary part, Im(p − μ_B). In both cases, the variance is a simple function of the mean.[5] Therefore, the variance has to be considered in a principal value sense if p − μ_B is real, while it exists if the imaginary part of p − μ_B is non-zero.
Note that these means and variances are exact, as they are not obtained by linearising the ratio.
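As a numerical sanity check on the scaled Dawson's function expression for the mean given above (with arbitrarily chosen parameter values), the sketch below compares it with a direct principal-value integral:

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import dawsn
from scipy.stats import norm

mu_B, sigma_B, p = 1.0, 0.7, 1.5    # illustrative values; the shift p - mu_B is real

# Principal value of E[1/(p - B)] = PV integral of phi(b)/(p - b) db.
# quad's 'cauchy' weight computes PV integral of f(b)/(b - wvar) db, hence the minus sign.
lo, hi = mu_B - 12 * sigma_B, mu_B + 12 * sigma_B
pv, _ = quad(lambda b: norm.pdf(b, mu_B, sigma_B), lo, hi, weight='cauchy', wvar=p)
mean_pv = -pv

# Scaled Dawson's function expression for the same mean
mean_dawson = np.sqrt(2.0) / sigma_B * dawsn((p - mu_B) / (np.sqrt(2.0) * sigma_B))
print(mean_pv, mean_dawson)         # the two values should agree closely
```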
The exact covariance of two ratios with a pair of different poles p₁ and p₂ is similarly available.[6] The case of the inverse of a complex normal variable B, shifted or not, exhibits different characteristics.
If X is an exponentially distributed random variable with rate parameter λ, then Y = 1/X has the cumulative distribution function

F_Y(y) = e^(−λ/y)  for y > 0.
Note that the expected value of this random variable does not exist.
The reciprocal exponential distribution finds use in the analysis of fading wireless communication systems.
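A short simulation sketch of the reciprocal exponential distribution, with an arbitrary rate parameter, checking the cumulative distribution function stated above:

```python
import numpy as np

lam = 1.5                           # illustrative rate parameter
rng = np.random.default_rng(4)
y = 1.0 / rng.exponential(scale=1.0 / lam, size=1_000_000)

# CDF check: F_Y(y0) = exp(-lam / y0) for y0 > 0
for y0 in (0.5, 1.0, 3.0):
    print(np.mean(y <= y0), np.exp(-lam / y0))

# The sample mean is unstable across runs and sample sizes, consistent with the
# expected value of Y not existing.
```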
If X is a Cauchy distributed (μ, σ) random variable, then 1/X is a Cauchy (μ/C, σ/C) random variable, where C = μ² + σ².
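An illustrative check of this result, with arbitrary values of μ and σ, comparing empirical quartiles of 1/X with those of a Cauchy(μ/C, σ/C) distribution (whose quartiles sit at the location parameter plus or minus the scale):

```python
import numpy as np

mu, sigma = 2.0, 3.0                # illustrative Cauchy location and scale
C = mu**2 + sigma**2
rng = np.random.default_rng(5)
y = 1.0 / (mu + sigma * rng.standard_cauchy(2_000_000))

# 1/X should be Cauchy(mu / C, sigma / C)
m, s = mu / C, sigma / C
print(np.percentile(y, [25, 50, 75]))   # empirical quartiles of 1/X
print([m - s, m, m + s])                # theoretical quartiles and median
```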
If X is a binomially distributed random variable with n trials and a probability of success p, then no closed form for the reciprocal distribution is known.
An asymptotic approximation for the non-central moments of the reciprocal distribution is known.
For a triangular distribution with lower limit a, upper limit b and mode c, where a < b and a ≤ c ≤ b, the mean of the reciprocal is given by

E[1/X] = (2 / (b − a)) ( b ln(b/c) / (b − c) − a ln(c/a) / (c − a) ).
Both moments of the reciprocal are only defined when the triangle does not cross zero, i.e. when a, b, and c are either all positive or all negative.
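A brief simulation check of the mean formula as reconstructed above, with arbitrarily chosen (all-positive) parameters:

```python
import numpy as np

a, b, c = 1.0, 4.0, 2.5             # illustrative limits and mode, all positive
rng = np.random.default_rng(6)
y = 1.0 / rng.triangular(a, c, b, size=2_000_000)

# Mean of 1/X as given in the formula above
mean_formula = 2.0 / (b - a) * (b * np.log(b / c) / (b - c) - a * np.log(c / a) / (c - a))
print(y.mean(), mean_formula)       # should agree to a few decimal places
```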