Joint probability distribution

The joint distribution can just as well be considered for any given number of random variables.

In the special case of continuous random variables, it is sufficient to consider probability density functions, and in the case of discrete random variables, it is sufficient to consider probability mass functions.

When the joint distribution of two discrete random variables A and B is laid out as a table of probabilities, the final row and the final column give the marginal probability distribution of A and the marginal probability distribution of B respectively.
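To make the table reading concrete, here is a small Python sketch (not from the original article; the probability values are invented for illustration) that recovers both marginals from a joint table by summing rows and columns:

```python
import numpy as np

# Hypothetical 2x3 joint probability table for discrete variables A and B;
# the numbers are illustrative only.
joint = np.array([[0.10, 0.20, 0.10],
                  [0.15, 0.30, 0.15]])
assert np.isclose(joint.sum(), 1.0)

marginal_A = joint.sum(axis=1)  # row sums: the table's final column
marginal_B = joint.sum(axis=0)  # column sums: the table's final row
print(marginal_A)  # [0.4 0.6]
print(marginal_B)  # [0.25 0.5 0.25]
```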

Consider the flip of two fair coins; let A and B be discrete random variables associated with the outcomes of the first and second coin flips respectively. Each coin flip is a Bernoulli trial: if a coin lands heads the associated random variable takes the value 1, and it takes the value 0 otherwise.

All possible outcomes are

$(A=0, B=0),\ (A=0, B=1),\ (A=1, B=0),\ (A=1, B=1).$

Since each outcome is equally likely, the joint probability mass function becomes

$\operatorname{P}(A = a, B = b) = \tfrac{1}{4} \quad \text{for } a, b \in \{0, 1\}.$

Since the coin flips are independent, the joint probability mass function is the product of the marginals:

$\operatorname{P}(A, B) = \operatorname{P}(A)\, \operatorname{P}(B).$

Consider the roll of a fair die and let $A = 1$ if the number is even (i.e., 2, 4, or 6) and $A = 0$ otherwise. Furthermore, let $B = 1$ if the number is prime (i.e., 2, 3, or 5) and $B = 0$ otherwise. Then the joint distribution of $A$ and $B$, expressed as a probability mass function, is

$\operatorname{P}(A=0, B=0) = \operatorname{P}(\{1\}) = \tfrac{1}{6}, \quad \operatorname{P}(A=1, B=0) = \operatorname{P}(\{4, 6\}) = \tfrac{2}{6},$
$\operatorname{P}(A=0, B=1) = \operatorname{P}(\{3, 5\}) = \tfrac{2}{6}, \quad \operatorname{P}(A=1, B=1) = \operatorname{P}(\{2\}) = \tfrac{1}{6}.$

These probabilities necessarily sum to 1, since some combination of $A$ and $B$ must occur.
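A brief Python sketch can verify both examples; encoding heads, evens, and primes as 0/1 follows the text, while everything else here is illustrative:

```python
from itertools import product
from fractions import Fraction

# Two fair coin flips: A and B are 1 for heads, 0 for tails.
outcomes = list(product([0, 1], repeat=2))        # all (A, B) pairs
joint = {ab: Fraction(1, 4) for ab in outcomes}   # each outcome equally likely

# Marginals: sum the joint pmf over the other variable.
pA = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
pB = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}

# The flips are independent, so the joint pmf factors into the marginals.
assert all(joint[(a, b)] == pA[a] * pB[b] for a, b in outcomes)

# Fair die: A = 1 if the roll is even, B = 1 if the roll is prime.
joint_die = {(a, b): Fraction(sum(1 for r in range(1, 7)
                                  if (r % 2 == 0) == bool(a)
                                  and (r in (2, 3, 5)) == bool(b)), 6)
             for a in (0, 1) for b in (0, 1)}
print(joint_die[(1, 1)])  # 1/6: only the roll 2 is both even and prime
```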

If more than one random variable is defined in a random experiment, it is important to distinguish between the joint probability distribution of X and Y and the probability distribution of each variable individually.

If the joint probability density function of random variables X and Y is $f_{X,Y}(x, y)$, the marginal probability density functions of X and Y, which define the marginal distributions, are given by

$f_X(x) = \int f_{X,Y}(x, y)\, dy, \qquad f_Y(y) = \int f_{X,Y}(x, y)\, dx,$

where the first integral is over all points in the range of $(X, Y)$ for which $X = x$, and the second is over all points in the range for which $Y = y$.
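As an illustration, the following Python sketch (using an assumed standard bivariate normal with correlation 0.5, not a distribution from the article) checks numerically that integrating a joint density over y returns the marginal density of X:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

rho = 0.5  # an assumed correlation for a standard bivariate normal

def f_xy(x, y):
    """Standard bivariate normal joint density with correlation rho."""
    z = (x**2 - 2 * rho * x * y + y**2) / (1 - rho**2)
    return np.exp(-z / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

# Marginal density of X at a point: integrate the joint density over y.
x = 0.7
fx, _ = quad(lambda y: f_xy(x, y), -np.inf, np.inf)
print(fx, norm.pdf(x))  # both ≈ 0.3123: the marginal of X is standard normal
```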

For a pair of random variables X, Y, the joint cumulative distribution function is given by[2]: p. 89

$F_{X,Y}(x, y) = \operatorname{P}(X \le x, Y \le y)$  (Eq.1)

where the right-hand side represents the probability that the random variable X takes on a value less than or equal to x and that Y takes on a value less than or equal to y.

For $N$ random variables $X_1, \ldots, X_N$, the joint CDF is given by

$F_{X_1,\ldots,X_N}(x_1,\ldots,x_N) = \operatorname{P}(X_1 \le x_1, \ldots, X_N \le x_N).$

Interpreting the $N$ random variables as a random vector $\mathbf{X} = (X_1, \ldots, X_N)^T$ yields a shorter notation:

$F_{\mathbf{X}}(\mathbf{x}) = \operatorname{P}(X_1 \le x_1, \ldots, X_N \le x_N).$

The joint probability mass function of two discrete random variables X, Y is

$p_{X,Y}(x, y) = \operatorname{P}(X = x,\, Y = y)$

or, written in terms of conditional distributions,

$p_{X,Y}(x, y) = \operatorname{P}(Y = y \mid X = x)\, \operatorname{P}(X = x) = \operatorname{P}(X = x \mid Y = y)\, \operatorname{P}(Y = y),$

where $\operatorname{P}(Y = y \mid X = x)$ is the probability of $Y = y$ given that $X = x$.
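The following Python sketch (with an invented pmf table) illustrates Eq.1 in the discrete case: the joint CDF at (x, y) is the sum of the pmf over the lower-left block of the support:

```python
import numpy as np

# Illustrative joint pmf of (X, Y) on the support {0, 1, 2} x {0, 1, 2}.
pmf = np.array([[0.10, 0.05, 0.05],
                [0.10, 0.20, 0.10],
                [0.05, 0.15, 0.20]])
assert np.isclose(pmf.sum(), 1.0)

def F(x, y):
    """Joint CDF P(X <= x, Y <= y): sum the pmf over the lower-left block."""
    return pmf[:x + 1, :y + 1].sum()

print(F(1, 1))  # 0.10 + 0.05 + 0.10 + 0.20 = 0.45
print(F(2, 2))  # 1.0: the CDF reaches 1 at the top of the support
```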

The generalization of the preceding two-variable case is the joint probability distribution of $n$ discrete random variables $X_1, X_2, \ldots, X_n$, which is

$p_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \operatorname{P}(X_1 = x_1 \text{ and } \dots \text{ and } X_n = x_n)$

or equivalently

$p_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = \operatorname{P}(X_1 = x_1) \cdot \operatorname{P}(X_2 = x_2 \mid X_1 = x_1) \cdot \operatorname{P}(X_3 = x_3 \mid X_1 = x_1, X_2 = x_2) \cdots \operatorname{P}(X_n = x_n \mid X_1 = x_1, \ldots, X_{n-1} = x_{n-1}).$

This identity is known as the chain rule of probability.
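A small numerical check of the chain rule, using a randomly generated joint pmf over three binary variables (purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
# A random joint pmf over three binary variables (purely illustrative).
p = rng.random((2, 2, 2))
p /= p.sum()

# Chain rule: p(x1,x2,x3) = p(x1) * p(x2 | x1) * p(x3 | x1, x2).
p1 = p.sum(axis=(1, 2))                          # marginal of X1
p2_given_1 = p.sum(axis=2) / p1[:, None]         # P(X2 = x2 | X1 = x1)
p3_given_12 = p / p.sum(axis=2, keepdims=True)   # P(X3 = x3 | X1, X2)

reconstructed = p1[:, None, None] * p2_given_1[:, :, None] * p3_given_12
assert np.allclose(reconstructed, p)             # the factorization is exact
```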

The joint probability density function $f_{X,Y}(x, y)$ for two continuous random variables is defined as the derivative of the joint cumulative distribution function (see Eq.1):

$f_{X,Y}(x, y) = \dfrac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}.$

This is equal to

$f_{X,Y}(x, y) = f_{Y \mid X}(y \mid x)\, f_X(x) = f_{X \mid Y}(x \mid y)\, f_Y(y),$

where $f_{Y \mid X}(y \mid x)$ and $f_{X \mid Y}(x \mid y)$ are the conditional distributions of Y given X = x and of X given Y = y respectively, and $f_X(x)$ and $f_Y(y)$ are the marginal distributions for X and Y respectively.
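The next sketch checks the derivative definition numerically for a joint CDF with a known density (independent unit exponentials, an assumed example rather than one from the article):

```python
import math

# Independent unit exponentials: F(x, y) = (1 - e^{-x}) (1 - e^{-y}).
F = lambda x, y: (1 - math.exp(-x)) * (1 - math.exp(-y))
f = lambda x, y: math.exp(-x - y)   # the known joint density

# Approximate the mixed partial d^2 F / (dx dy) by finite differences.
def density_from_cdf(x, y, h=1e-4):
    return (F(x + h, y + h) - F(x + h, y) - F(x, y + h) + F(x, y)) / h**2

print(density_from_cdf(1.0, 0.5))  # ≈ 0.2231
print(f(1.0, 0.5))                 # e^{-1.5} ≈ 0.2231
```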

With one variable of each type

$f_{X,Y}(x, y) = f_{X \mid Y}(x \mid y)\, \operatorname{P}(Y = y) = \operatorname{P}(Y = y \mid X = x)\, f_X(x).$

One example of a situation in which one may wish to find the cumulative distribution of one random variable which is continuous and another random variable which is discrete arises when one wishes to use a logistic regression in predicting the probability of a binary outcome Y conditional on the value of a continuously distributed outcome X.

One must use the "mixed" joint density when finding the cumulative distribution of this binary outcome because the input variables $(X, Y)$ were initially defined in such a way that one could not collectively assign them either a probability density function or a probability mass function.

Either of these two decompositions can then be used to recover the joint cumulative distribution function:

$F_{X,Y}(x, y) = \sum_{t \le y} \int_{s = -\infty}^{x} f_{X,Y}(s, t)\, ds.$

The definition generalizes to a mixture of arbitrary numbers of discrete and continuous random variables.
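As a sketch of this construction, assume (as the text suggests) a logistic model for the binary Y and, additionally, a standard normal X; the coefficients below are invented:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

b0, b1 = -0.3, 1.2  # assumed logistic-regression coefficients
sigmoid = lambda s: 1.0 / (1.0 + np.exp(-s))

def f_mixed(x, y):
    """Mixed joint density P(Y = y | X = x) * f_X(x): Y binary, X ~ N(0, 1)."""
    p1 = sigmoid(b0 + b1 * x)
    return (p1 if y == 1 else 1.0 - p1) * norm.pdf(x)

def F(x, y):
    """Joint CDF: sum over discrete t <= y, integrate over s in (-inf, x]."""
    return sum(quad(lambda s: f_mixed(s, t), -np.inf, x)[0]
               for t in (0, 1) if t <= y)

print(F(np.inf, 1))  # ≈ 1.0: total probability mass
print(F(0.0, 0))     # P(X <= 0, Y = 0)
```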

In general, two random variables X and Y are independent if and only if the joint cumulative distribution function satisfies

$F_{X,Y}(x, y) = F_X(x)\, F_Y(y)$ for all $x, y$.

Two discrete random variables X and Y are independent if and only if the joint probability mass function satisfies

$p_{X,Y}(x, y) = p_X(x)\, p_Y(y)$

for all $x$ and $y$.

As the number of independent random events grows, the related joint probability value decreases rapidly to zero, according to a negative exponential law.
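A one-line illustration (the event probability 0.9 is an arbitrary choice):

```python
# Joint probability of n independent events, each of probability 0.9:
# the product 0.9**n decays like e^{-0.105 n}, a negative exponential in n.
for n in (1, 10, 50, 100):
    print(n, 0.9 ** n)  # 0.9, 0.35, 0.0052, 2.7e-05
```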

Similarly, two absolutely continuous random variables are independent if and only if

$f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$

for all $x$ and $y$.
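Returning to the die example above, a quick check shows that A ("even") and B ("prime") fail the factorization test and are therefore dependent:

```python
from fractions import Fraction

# Joint pmf of the die example: A = "roll is even", B = "roll is prime".
joint = {(0, 0): Fraction(1, 6),   # {1}
         (1, 0): Fraction(2, 6),   # {4, 6}
         (0, 1): Fraction(2, 6),   # {3, 5}
         (1, 1): Fraction(1, 6)}   # {2}

pA = {a: sum(p for (x, _), p in joint.items() if x == a) for a in (0, 1)}
pB = {b: sum(p for (_, y), p in joint.items() if y == b) for b in (0, 1)}

# P(A=1, B=1) = 1/6 but P(A=1) * P(B=1) = 1/4, so A and B are dependent.
print(all(joint[(a, b)] == pA[a] * pB[b]
          for a in (0, 1) for b in (0, 1)))  # False
```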

If a subset $A$ of the variables $X_1, \ldots, X_n$ is conditionally dependent given another subset $B$ of these variables, then the probability mass function of the joint distribution is $\operatorname{P}(X_1, \ldots, X_n) = \operatorname{P}(B) \cdot \operatorname{P}(A \mid B)$.

Therefore, it can be efficiently represented by the lower-dimensional probability distributions $\operatorname{P}(B)$ and $\operatorname{P}(A \mid B)$.

Such conditional independence relations can be represented with a Bayesian network or copula functions.
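One way to see the efficiency gain is to count parameters. The sketch below assumes a chain-structured factorization (a simple special case of a Bayesian network) over binary variables:

```python
# Storing a joint pmf of n binary variables: the full table needs 2**n - 1
# free parameters, while a chain-structured factorization
# P(X1) * prod_i P(X_i | X_{i-1}) needs only 2*n - 1.
def full_table(n):
    return 2 ** n - 1

def chain_factorization(n):
    return 1 + 2 * (n - 1)

for n in (3, 10, 20):
    print(n, full_table(n), chain_factorization(n))  # 20: 1048575 vs 39
```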

A common measure of the relationship between two random variables is the covariance.

Covariance is a measure of the linear relationship between random variables.

The covariance between the random variables X and Y, denoted $\operatorname{cov}(X, Y)$, is[3]

$\sigma_{XY} = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - \mu_X \mu_Y.$

There is another measure of the relationship between two random variables that is often easier to interpret than the covariance.
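Using the die example's joint pmf from earlier, the covariance of A and B can be computed directly from the definition:

```python
from fractions import Fraction

# Reusing the die example's joint pmf (A = even, B = prime).
joint = {(0, 0): Fraction(1, 6), (1, 0): Fraction(2, 6),
         (0, 1): Fraction(2, 6), (1, 1): Fraction(1, 6)}

EA  = sum(a * p for (a, _), p in joint.items())       # E[A] = 1/2
EB  = sum(b * p for (_, b), p in joint.items())       # E[B] = 1/2
EAB = sum(a * b * p for (a, b), p in joint.items())   # E[AB] = 1/6

cov = EAB - EA * EB
print(cov)  # -1/12: "even" and "prime" are negatively related on a die
```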

The correlation just scales the covariance by the product of the standard deviations of the variables: the correlation between random variables X and Y, denoted $\rho_{XY}$, is

$\rho_{XY} = \dfrac{\operatorname{cov}(X, Y)}{\sqrt{V(X)\, V(Y)}} = \dfrac{\sigma_{XY}}{\sigma_X\, \sigma_Y}.$

Consequently, the correlation is a dimensionless quantity that can be used to compare the linear relationships between pairs of variables in different units.

If the points in the joint probability distribution of X and Y that receive positive probability tend to fall along a line of positive (or negative) slope, $\rho_{XY}$ is near +1 (or −1).

If $\rho_{XY}$ equals +1 or −1, it can be shown that the points in the joint probability distribution that receive positive probability fall exactly along a straight line.

Similar to covariance, the correlation is a measure of the linear relationship between random variables.
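Continuing the same die example (an illustration, not from the article), dividing the covariance by the product of the standard deviations gives the dimensionless correlation:

```python
import math
from fractions import Fraction

# Continuing the die example: A and B are each Bernoulli(1/2), cov = -1/12.
var_A = var_B = Fraction(1, 4)   # p * (1 - p) with p = 1/2
cov = Fraction(-1, 12)

rho = float(cov) / math.sqrt(float(var_A) * float(var_B))
print(rho)  # -0.333...: dimensionless, and unchanged by rescaling A or B
```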