Authors: Tim Salimans, Max Welling, Diederik Kingma
DOI:
Keywords: Applied mathematics, Markov chain Monte Carlo, Monte Carlo method, Computation, Maximization, Variational message passing, Bayesian inference, Mathematical optimization, Mathematics, Random variable, Inference
Abstract: Recent advances in stochastic gradient variational inference have made it possible to perform variational Bayesian inference with posterior approximations containing auxiliary random variables. This enables us to explore a new synthesis of variational inference and Monte Carlo methods, in which we incorporate one or more steps of MCMC into our variational approximation. By doing so we obtain a rich class of inference algorithms bridging the gap between variational methods and MCMC, offering the best of both worlds: fast posterior approximation through maximization of an explicit objective, with the option of trading off additional computation for additional accuracy. We describe the theoretical foundations that make this possible and show some promising first results.
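The abstract alludes to an explicit objective obtained by treating the intermediate MCMC states as auxiliary variables of the variational approximation. As a minimal sketch (assuming a Markov-chain approximation q built from an initial distribution and T transition kernels, and an auxiliary "inverse" model r; the exact construction is given in the paper itself), the bound being maximized has the form:

\log p(x) \;\ge\; \mathbb{E}_{q(z_{0:T}\mid x)}\!\left[\log \frac{p(x, z_T)\, r(z_{0:T-1}\mid x, z_T)}{q(z_{0:T}\mid x)}\right],
\qquad
q(z_{0:T}\mid x) \;=\; q(z_0\mid x)\prod_{t=1}^{T} q(z_t\mid x, z_{t-1}).

Here z_T is the final state of the chain used as the posterior sample; adding more transitions (larger T) trades extra computation for a tighter bound, which is the accuracy/computation trade-off mentioned above.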