Expectation propagation



Expectation propagation (EP) is a technique in Bayesian machine learning.[1]

EP finds tractable approximations to a probability distribution.[1] It is an iterative method that exploits the factorization structure of the target distribution.[1] It differs from other Bayesian approximation approaches, such as variational Bayesian methods.[1]

More specifically, suppose we wish to approximate an intractable probability distribution p(𝐱) by a tractable distribution q(𝐱). Expectation propagation obtains this approximation by minimizing the Kullback–Leibler divergence KL(p||q),[1] whereas variational Bayesian methods minimize KL(q||p) instead.[1]
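The practical effect of the divergence direction can be seen with a small numeric sketch (an illustrative example, not drawn from the cited reference): for a bimodal target p, KL(p||q) favors a broad, mass-covering approximation, while KL(q||p) favors a mode-seeking one.

```python
import numpy as np

# Hypothetical discrete example: a bimodal target and two candidate
# approximations, one mass-covering and one mode-seeking.
p = np.array([0.49, 0.02, 0.49])        # bimodal target
q_broad = np.array([1/3, 1/3, 1/3])     # mass-covering candidate
q_mode = np.array([0.98, 0.01, 0.01])   # mode-seeking candidate

def kl(a, b):
    """KL divergence KL(a||b) between two discrete distributions."""
    return float(np.sum(a * np.log(a / b)))

# KL(p||q), the direction EP minimizes, penalizes q for placing little
# mass where p is large, so the broad candidate wins in this direction;
# in the reverse direction KL(q||p) the mode-seeking candidate wins.
print(kl(p, q_broad), kl(p, q_mode))    # broad scores lower here
print(kl(q_broad, p), kl(q_mode, p))    # mode-seeking scores lower here
```

This is the standard intuition for why EP-style approximations tend to cover the support of the target, while variational approximations tend to lock onto a single mode.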

If q(𝐱) is a Gaussian 𝒩(𝐱|μ,Σ), then KL(p||q) is minimized by setting μ and Σ equal to the mean and covariance of p(𝐱), respectively; this is called moment matching.[1]
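As a minimal sketch of moment matching (with a made-up target, not from the cited reference): for a one-dimensional Gaussian mixture p, the KL(p||q)-minimizing Gaussian q simply has the mean and variance of p, which can be computed in closed form from the mixture parameters.

```python
import numpy as np

# Hypothetical 1D target p: a two-component Gaussian mixture.
w = np.array([0.3, 0.7])     # mixture weights (sum to 1)
m = np.array([-2.0, 1.0])    # component means
v = np.array([0.5, 1.5])     # component variances

# Moment matching: the best Gaussian q under KL(p||q) has the
# mean and variance of p itself.
mu = np.sum(w * m)                       # E[x]   = 0.1
sigma2 = np.sum(w * (v + m**2)) - mu**2  # Var[x] = 3.09

print(mu, sigma2)
```

Note that the matched Gaussian is much wider than either mixture component: moment matching spreads q to cover both modes of p, consistent with the mass-covering behavior of the KL(p||q) direction.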

Applications

Expectation propagation via moment matching plays a vital role in approximating the indicator functions that appear when deriving the message-passing equations for TrueSkill.
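In that setting, an indicator factor such as 𝟙[x > 0] truncates a Gaussian belief, and the truncated distribution is projected back to a Gaussian by moment matching. A sketch of that projection step, using the standard closed-form moments of a truncated Gaussian (the function names below are our own, not TrueSkill's API):

```python
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_moments(mu, sigma):
    """Mean and variance of N(mu, sigma^2) truncated to x > 0.

    This is the moment-matching projection behind the v and w
    correction functions used in TrueSkill's message passing.
    """
    a = -mu / sigma                    # standardized truncation point
    lam = phi(a) / (1.0 - Phi(a))      # inverse Mills ratio
    mean = mu + sigma * lam
    var = sigma * sigma * (1.0 - lam * (lam - a))
    return mean, var
```

For example, truncating a standard normal to x > 0 gives mean √(2/π) ≈ 0.798 and variance 1 − 2/π ≈ 0.363; the Gaussian with these moments is the EP approximation to the truncated factor.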

References

Template:Reflist
