Incremental Variational Inference for Latent Dirichlet Allocation

Authors: Beyza Ermis, Cedric Archambeau

DOI:

Keywords: Monotonic function, Set (abstract data type), Inference, Latent Dirichlet allocation, Mathematical optimization, Distributed algorithm, Variational message passing, Mathematics, Local optimum, Stochastic approximation

Abstract: We introduce incremental variational inference and apply it to latent Dirichlet allocation (LDA). Incremental variational inference is inspired by incremental EM and provides an alternative to stochastic variational inference. Incremental LDA can process massive document collections; it does not require setting a learning rate, converges faster to a local optimum of the variational bound, and enjoys the attractive property of monotonically increasing that bound. We study the performance of incremental LDA on large benchmark data sets. We further introduce a stochastic approximation of incremental inference, which extends it to the asynchronous distributed setting. The resulting distributed algorithm achieves performance comparable to single-host incremental variational inference, but with a significant speed-up.
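The abstract's key idea is that, in the spirit of incremental EM, each document's cached contribution to the global topic statistics is swapped out and replaced after a local update, so no learning-rate schedule is needed. The toy sketch below illustrates this bookkeeping; the corpus, priors, and the simplified mean-field E-step are illustrative assumptions, not the paper's exact updates:

```python
import numpy as np

rng = np.random.default_rng(0)
K, V = 3, 8                       # topics, vocabulary size (toy values)
eta, alpha = 0.1, 0.1             # Dirichlet priors (illustrative)
docs = [rng.integers(0, V, size=20) for _ in range(5)]   # toy corpus
n_tokens = sum(len(d) for d in docs)

# Global topic-word statistics and one cached contribution per document.
lam = np.full((K, V), eta) + rng.random((K, V))
cached = [np.zeros((K, V)) for _ in docs]
init_total = lam.sum()

def e_step(doc, lam):
    """Local update: return the document's expected topic-word counts."""
    beta = lam / lam.sum(axis=1, keepdims=True)   # expected topic-word probs
    gamma = np.full(K, alpha) + len(doc) / K      # per-doc topic proportions
    for _ in range(20):
        theta = gamma / gamma.sum()
        phi = beta[:, doc] * theta[:, None]       # K x N responsibilities
        phi /= phi.sum(axis=0, keepdims=True)     # normalize over topics
        gamma = alpha + phi.sum(axis=1)
    stats = np.zeros((K, V))
    np.add.at(stats.T, doc, phi.T)                # accumulate soft counts
    return stats

# Incremental sweeps: remove a document's stale statistics, recompute them
# against the current global state, then add the fresh statistics back.
for sweep in range(5):
    for d, doc in enumerate(docs):
        lam -= cached[d]
        stats = e_step(doc, lam)
        lam += stats
        cached[d] = stats
```

Because stale statistics are subtracted before fresh ones are added, the global parameters remain an exact sum of the prior and per-document contributions, which is what allows the bound to increase monotonically without the learning rate that stochastic variational inference requires.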

References (22)
Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer, 2006.
David M. Blei, Andrew Y. Ng, Michael I. Jordan. Latent Dirichlet allocation. Journal of Machine Learning Research, vol. 3, pp. 993-1022, 2003. doi:10.5555/944919.944937
Ke Zhai, Jordan Boyd-Graber, Sebastian Nima Bruch, Mohamad L. Alkhouja. Mr. LDA: a flexible large scale topic modeling package using variational inference in MapReduce. Proceedings of the International Conference on World Wide Web (WWW), pp. 879-888, 2012. doi:10.1145/2187836.2187955
Alexander Smola, Shravan Narayanamurthy. An architecture for parallel topic models. Proceedings of the VLDB Endowment, vol. 3, pp. 703-710, 2010. doi:10.14778/1920841.1920931
Nicolas Le Roux, Francis R. Bach, Mark Schmidt. A Stochastic Gradient Method with an Exponential Convergence Rate for Finite Training Sets. Advances in Neural Information Processing Systems, vol. 25, pp. 2663-2671, 2012.
Max Welling, Arthur U. Asuncion, Padhraic Smyth. Asynchronous Distributed Learning of Topic Models. Advances in Neural Information Processing Systems, vol. 21, pp. 81-88, 2008.
Jason Wolfe, Aria Haghighi, Dan Klein. Fully distributed EM for very large datasets. Proceedings of the 25th International Conference on Machine Learning (ICML), pp. 1184-1191, 2008. doi:10.1145/1390156.1390305
Matthew James Beal. Variational Algorithms for Approximate Bayesian Inference. PhD thesis, Gatsby Computational Neuroscience Unit, University College London, 2003.