Expectation propagation

Expectation propagation (EP) is a technique in Bayesian machine learning.[1] EP finds approximations to a probability distribution.[1] It uses an iterative approach that leverages the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches such as variational Bayesian methods.[1]
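As a minimal sketch of this iterative scheme (illustrative, not a full implementation): the target below is a Gaussian prior times a single step factor 1[x > 0]. EP keeps the tractable Gaussian factor exact, approximates the step factor by a Gaussian "site", and repeatedly removes the site (the cavity), moment-matches the tilted distribution, and recovers an updated site. The specific target and all variable names are assumptions for the example.

```python
import math

# Toy EP sketch: target p(x) ∝ N(x|0,1) * 1[x > 0].
# The indicator factor gets a Gaussian site that EP refines iteratively.
phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2 * math.pi)  # N(0,1) pdf
Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # N(0,1) cdf

# Natural parameters: precision tau, precision-times-mean nu.
tau0, nu0 = 1.0, 0.0                  # exact prior N(0, 1)
tau_s, nu_s = 0.0, 0.0                # site for the indicator, initially flat
tau, nu = tau0 + tau_s, nu0 + nu_s    # current global approximation q

for _ in range(5):                    # sweeps over the approximate factor(s)
    # 1. Remove the site from q to form the cavity distribution.
    tau_c, nu_c = tau - tau_s, nu - nu_s
    m_c, v_c = nu_c / tau_c, 1.0 / tau_c
    # 2. Moment-match the tilted distribution cavity(x) * 1[x > 0],
    #    a truncated Gaussian whose moments are available in closed form.
    alpha = -m_c / math.sqrt(v_c)
    lam = phi(alpha) / (1.0 - Phi(alpha))
    m_t = m_c + math.sqrt(v_c) * lam
    v_t = v_c * (1.0 + alpha * lam - lam * lam)
    # 3. Set q to the matched Gaussian and recover the updated site.
    tau, nu = 1.0 / v_t, m_t / v_t
    tau_s, nu_s = tau - tau_c, nu - nu_c

mean, var = nu / tau, 1.0 / tau
# With a single non-Gaussian factor the fixed point is exact: the
# moments of N(0,1) truncated to x > 0.
assert abs(mean - phi(0.0) / 0.5) < 1e-12
assert abs(var - (1.0 - (phi(0.0) / 0.5) ** 2)) < 1e-12
```

With several non-Gaussian factors the sweeps genuinely interact, since each cavity depends on the other sites; that is where EP's iteration earns its keep.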

More specifically, suppose we wish to approximate an intractable probability distribution p(x) with a tractable distribution q(x). Expectation propagation achieves this approximation by minimizing the Kullback-Leibler divergence KL(p||q).[1] Variational Bayesian methods minimize KL(q||p) instead.[1]

If q(x) is a Gaussian N(x|μ, Σ), then KL(p||q) is minimized with μ and Σ equal to the mean and covariance of p(x), respectively; this is called moment matching.[1]
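A small numerical sketch of this fact, under an assumed toy target (a two-component Gaussian mixture standing in for an intractable p): the Gaussian q that copies p's mean and variance attains a smaller KL(p||q) than nearby parameter choices.

```python
import numpy as np

# Target p(x): equal mixture of N(-2, 1) and N(2, 1), evaluated on a grid.
x = np.linspace(-12.0, 12.0, 200001)
dx = x[1] - x[0]

def normal_pdf(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

p = 0.5 * normal_pdf(x, -2.0, 1.0) + 0.5 * normal_pdf(x, 2.0, 1.0)

# Mean and variance of p by numerical integration (here 0 and 1 + 4 = 5).
mu = np.sum(x * p) * dx
var = np.sum((x - mu) ** 2 * p) * dx

def kl_p_q(m, v):
    """KL(p || q) on the grid, with q = N(m, v)."""
    q = normal_pdf(x, m, v)
    return np.sum(p * (np.log(p + 1e-300) - np.log(q))) * dx

# The moment-matched Gaussian minimizes KL(p||q) over all Gaussians,
# so perturbing either parameter can only increase the divergence.
assert kl_p_q(mu, var) < kl_p_q(mu + 0.5, var)
assert kl_p_q(mu, var) < kl_p_q(mu, 0.5 * var)
```

Note the direction of the divergence: minimizing KL(p||q) forces q to cover all of p's mass (here a broad Gaussian spanning both modes), whereas minimizing KL(q||p), as in variational Bayes, tends to lock onto a single mode.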

Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.[1]
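To illustrate (a sketch under assumed toy inputs, not TrueSkill's full message-passing schedule): a greater-than indicator multiplied into a Gaussian yields a truncated Gaussian, whose moment-matched mean and variance are expressed in the TrueSkill literature through the functions v(t) = φ(t)/Φ(t) and w(t) = v(t)·(v(t) + t). The check below compares these closed forms against brute-force integration.

```python
import math
import numpy as np

# v and w for the greater-than factor: moment matching
# N(x | mu, s2) * 1[x > 0] gives mean mu + s*v(t) and variance
# s2 * (1 - w(t)), where t = mu / s.
def v(t):
    pdf = math.exp(-0.5 * t * t) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    return pdf / cdf

def w(t):
    return v(t) * (v(t) + t)

mu, s2 = 0.5, 2.0        # illustrative Gaussian message parameters
s = math.sqrt(s2)
t = mu / s
mean_cf = mu + s * v(t)
var_cf = s2 * (1.0 - w(t))

# Brute-force moments of the truncated Gaussian on a grid.
x = np.linspace(0.0, mu + 10.0 * s, 400001)
dx = x[1] - x[0]
p = np.exp(-0.5 * (x - mu) ** 2 / s2)
p /= np.sum(p) * dx
mean_num = np.sum(x * p) * dx
var_num = np.sum((x - mean_num) ** 2 * p) * dx

assert abs(mean_cf - mean_num) < 1e-4
assert abs(var_cf - var_num) < 1e-4
```

The truncation always shrinks the variance (w(t) is between 0 and 1), which is how observing a match outcome sharpens a player's skill estimate.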
