Perturbative Corrections to Expectation Consistent Approximate Inference

This video was recorded at the NIPS Workshop on Approximate Bayesian Inference in Continuous/Hybrid Models, Whistler, 2007.

Algorithms for approximate inference usually come without any guarantee on the quality of the approximation. Nevertheless, we often find cases where such algorithms compute posterior moments extremely well when compared against time-consuming (and, in the limit, exact) Monte Carlo simulations or exact enumeration. A prominent example is the Expectation Propagation (EP) algorithm applied to Gaussian process classification. Can we understand when and why we can trust the approximate results or, if not, how we could obtain systematic improvements? In this talk, we rederive the fixed-point conditions of EP using the ideas of expectation consistency (EC) [1] and explicitly consider the terms neglected in the approximation. We show how one can derive a formal (asymptotic) power series expansion for this correction and compute its leading terms. We illustrate the approach for the case of GP classification and for networks of Ising variables.

[1] Manfred Opper and Ole Winther, "Expectation Consistent Approximate Inference," JMLR 6, 2177-2204 (2005).
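The abstract takes EP for Gaussian process classification as its running example. As background, the following is a minimal sketch of standard EP moment matching for GP binary classification with a probit likelihood (the baseline whose neglected terms the talk's perturbative corrections address); it is not the speakers' code, and the RBF kernel, toy data, damping-free sequential updates, and iteration count are illustrative assumptions.

```python
# Minimal EP sketch for Gaussian process binary classification with a
# probit likelihood. Illustrative only: kernel, data, and sweep count
# are assumptions, and no damping or convergence check is included.
import numpy as np
from scipy.stats import norm

def rbf_kernel(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential covariance matrix over rows of X.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def ep_gpc(K, y, n_sweeps=50, jitter=1e-9):
    """EP for GP classification: posterior mean/cov of the latent f."""
    n = len(y)
    # Site (approximate likelihood) natural parameters.
    tau_site = np.zeros(n)   # site precisions
    nu_site = np.zeros(n)    # site precision-weighted means
    Sigma = K + jitter * np.eye(n)   # current posterior covariance
    mu = np.zeros(n)                 # current posterior mean
    for _ in range(n_sweeps):
        for i in range(n):
            # Cavity: remove site i from the current marginal.
            tau_cav = 1.0 / Sigma[i, i] - tau_site[i]
            nu_cav = mu[i] / Sigma[i, i] - nu_site[i]
            m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav
            # Moments of the tilted distribution cavity(f) * Phi(y_i f).
            z = y[i] * m_cav / np.sqrt(1.0 + v_cav)
            ratio = norm.pdf(z) / norm.cdf(z)
            m_tilt = m_cav + y[i] * v_cav * ratio / np.sqrt(1.0 + v_cav)
            v_tilt = v_cav - v_cav**2 * ratio * (z + ratio) / (1.0 + v_cav)
            # New site so the marginal matches the tilted moments.
            tau_new = 1.0 / v_tilt - tau_cav
            nu_new = m_tilt / v_tilt - nu_cav
            delta_tau = tau_new - tau_site[i]
            tau_site[i], nu_site[i] = tau_new, nu_new
            # Rank-one refresh of the posterior after changing site i.
            si = Sigma[:, i]
            Sigma = Sigma - (delta_tau / (1.0 + delta_tau * Sigma[i, i])) * np.outer(si, si)
            mu = Sigma @ nu_site
    return mu, Sigma

# Toy usage: 1-D inputs with +/-1 labels.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(20, 1))
y = np.sign(np.sin(X[:, 0]))
mu, Sigma = ep_gpc(rbf_kernel(X), y)
print(np.round(mu[:5], 3))
```

The fixed point of these sweeps is exactly the set of EC/EP consistency conditions the talk starts from; the correction series discussed in the abstract expands around this fixed point in the terms the Gaussian site approximation discards.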
