A Complete Recipe for Stochastic Gradient MCMC

Authors: Tianqi Chen, Emily B. Fox, Yi-An Ma

DOI:

Keywords: Riemann hypothesis, Mathematical optimization, Gradient noise, Hybrid Monte Carlo, Markov process, Mathematics, Leverage (statistics), Applied mathematics, Scalability, Continuous-time stochastic process, Markov chain Monte Carlo

Abstract: Many recent Markov chain Monte Carlo (MCMC) samplers leverage continuous dynamics to define a transition kernel that efficiently explores a target distribution. In tandem, a focus has …
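The paper's framework expresses stochastic gradient MCMC samplers as discretizations of a single stochastic differential equation recipe; stochastic gradient Langevin dynamics (SGLD) is its simplest instance (roughly, constant diffusion and zero curl matrix). As a hedged illustration only, and not code from the paper, the sketch below runs plain SGLD on a toy 1-D Gaussian posterior; the data, step size, and function names are illustrative assumptions.

```python
# Minimal SGLD sketch (illustrative, not the paper's code): gradient step on a
# minibatch estimate of the log posterior plus injected Gaussian noise.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y_i ~ N(theta_true, sigma^2), Gaussian prior N(0, prior_var) on theta.
theta_true, sigma, N = 1.5, 1.0, 1000
data = rng.normal(theta_true, sigma, size=N)
prior_var = 10.0

def stoch_grad_log_post(theta, minibatch):
    """Unbiased minibatch estimate of the gradient of the log posterior."""
    grad_prior = -theta / prior_var
    grad_lik = (N / len(minibatch)) * np.sum(minibatch - theta) / sigma**2
    return grad_prior + grad_lik

def sgld(num_iters=5000, batch_size=50, eps=1e-3):
    """Stochastic gradient Langevin dynamics with a fixed step size eps."""
    theta = 0.0
    samples = []
    for _ in range(num_iters):
        minibatch = rng.choice(data, size=batch_size, replace=False)
        grad = stoch_grad_log_post(theta, minibatch)
        # SGLD update: noisy gradient step plus N(0, 2*eps) injected noise.
        theta = theta + eps * grad + np.sqrt(2 * eps) * rng.normal()
        samples.append(theta)
    return np.array(samples)

if __name__ == "__main__":
    samples = sgld()
    print("posterior mean estimate:", samples[1000:].mean())
```

With a small fixed step size the chain's samples approximate the posterior over theta; the paper's contribution is showing how choices of diffusion and curl matrices in the underlying SDE recover this and other samplers (e.g., SGHMC) while guaranteeing the correct stationary distribution.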

References (20)
Max Welling, Babak Shahbaba, Sungjin Ahn, Distributed Stochastic Gradient MCMC. International Conference on Machine Learning, vol. 32, pp. 1044-1052 (2014)
Robert Zwanzig, Nonequilibrium Statistical Mechanics. Oxford University Press (2001)
Mark Girolami, Ben Calderhead, Riemann manifold Langevin and Hamiltonian Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol. 73, pp. 123-214 (2011), 10.1111/J.1467-9868.2010.00765.X
Chris Holmes, Arnaud Doucet, Rémi Bardenet, Towards scaling up Markov chain Monte Carlo: an adaptive subsampling approach. International Conference on Machine Learning, vol. 1, pp. 405-413 (2014)
Radford M. Neal, MCMC Using Hamiltonian Dynamics. Handbook of Markov Chain Monte Carlo, pp. 139-188 (2011), 10.1201/B10905-10
Michael Betancourt, The Fundamental Incompatibility of Scalable Hamiltonian Monte Carlo and Naive Data Subsampling. International Conference on Machine Learning, pp. 533-540 (2015)
David M. Blei, Andrew Y. Ng, Michael I. Jordan, Latent Dirichlet Allocation. Journal of Machine Learning Research, vol. 3, pp. 993-1022 (2003), 10.5555/944919.944937
Herbert Robbins, Sutton Monro, A Stochastic Approximation Method. Annals of Mathematical Statistics, vol. 22, pp. 400-407 (1951), 10.1214/AOMS/1177729586
Simon Duane, A.D. Kennedy, Brian J. Pendleton, Duncan Roweth, Hybrid Monte Carlo. Physics Letters B, vol. 195, pp. 216-222 (1987), 10.1016/0370-2693(87)91197-X