Divergence measures and a general framework for local variational approximation.

Neural Networks

Graduate School of Information Science, Nara Institute of Science and Technology, Takayama-cho, Ikoma, Nara, Japan.

Published: December 2011

The local variational method is a technique for approximating an intractable posterior distribution in Bayesian learning. This article formulates a general framework for local variational approximation and shows that its objective function decomposes into the sum of the Kullback information and the expected Bregman divergence from the approximating posterior distribution to the Bayesian posterior distribution. Based on a geometrical argument in the space of approximating posteriors, we propose an efficient method to evaluate an upper bound of the marginal likelihood. Moreover, we demonstrate that the variational Bayesian approach for latent variable models can be viewed as a special case of this general framework.
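The best-known instance of a local variational approximation is the Jaakkola–Jordan lower bound on the logistic sigmoid, in which a variational parameter ξ yields a pointwise exponential-family bound that is tight at x = ±ξ. The sketch below (plain Python; the function names are my own) checks that bound numerically. It illustrates the local variational idea only and is not the general framework of this article.

```python
import math

def sigmoid(x):
    """Logistic sigmoid 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def lam(xi):
    """lambda(xi) = tanh(xi/2) / (4*xi); its limit as xi -> 0 is 1/8."""
    return math.tanh(xi / 2.0) / (4.0 * xi) if xi != 0.0 else 0.125

def sigmoid_lower_bound(x, xi):
    """Jaakkola-Jordan local variational lower bound on sigmoid(x).

    The bound holds for every x and every xi, and is exact at x = +/- xi;
    optimizing xi locally is what makes the approximation 'local'.
    """
    return sigmoid(xi) * math.exp((x - xi) / 2.0 - lam(xi) * (x * x - xi * xi))

if __name__ == "__main__":
    xi = 1.5
    # The bound never exceeds the sigmoid ...
    for x in (-3.0, -1.5, 0.0, 1.5, 3.0):
        assert sigmoid_lower_bound(x, xi) <= sigmoid(x) + 1e-12
    # ... and touches it exactly at x = +/- xi.
    assert abs(sigmoid_lower_bound(xi, xi) - sigmoid(xi)) < 1e-12
    assert abs(sigmoid_lower_bound(-xi, xi) - sigmoid(-xi)) < 1e-12
```

Because the bound is exponential-quadratic in x, substituting it for each sigmoid factor in a logistic-regression likelihood makes the posterior integral Gaussian and hence tractable, which is the typical use of this construction.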


DOI: http://dx.doi.org/10.1016/j.neunet.2011.06.004

