Authors: Beyza Ermis, Cedric Archambeau
DOI:
Keywords: Monotonic function, Set (abstract data type), Inference, Latent Dirichlet allocation, Mathematical optimization, Distributed algorithm, Variational message passing, Mathematics, Local optimum, Stochastic approximation
Abstract: We introduce incremental variational inference and apply it to latent Dirichlet allocation (LDA). Incremental variational inference is inspired by incremental EM and provides an alternative to stochastic variational inference. Incremental LDA can process massive document collections, does not require setting a learning rate, converges faster to a local optimum of the variational bound, and enjoys the attractive property of monotonically increasing it. We study its performance on large benchmark data sets. We further introduce a stochastic approximation of incremental variational inference which extends to the asynchronous distributed setting. The resulting distributed algorithm achieves performance comparable to single-host inference, but with a significant speed-up.
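To make the incremental idea concrete, below is a minimal Python sketch of incremental-EM-style variational updates for mean-field LDA. It is an illustration under stated assumptions, not the authors' implementation: the corpus, sizes, and the hyperparameters alpha and eta are made-up toy values. The key mechanism matching the abstract is that each document's cached sufficient statistics are swapped out for fresh ones, so the global topic parameters are recomputed exactly and no learning-rate schedule is needed, unlike stochastic variational inference.

import numpy as np
from scipy.special import digamma

rng = np.random.default_rng(0)
K, V, D = 10, 1000, 100              # topics, vocabulary, documents (toy sizes, assumed)
alpha, eta = 0.1, 0.01               # Dirichlet hyperparameters (assumed values)
docs = [rng.integers(V, size=50) for _ in range(D)]   # toy corpus: arrays of word ids

lam = rng.gamma(100.0, 0.01, (K, V))                  # global topic variational parameters
stats = np.zeros((K, V))                              # running expected word-topic counts
doc_stats = [np.zeros((K, V)) for _ in range(D)]      # cached per-document statistics

def local_step(words, lam, n_iter=20):
    """Mean-field E-step for one document: returns its expected count statistics."""
    Elog_beta = digamma(lam) - digamma(lam.sum(axis=1, keepdims=True))
    gamma = np.ones(K)               # document-topic variational parameter
    for _ in range(n_iter):
        Elog_theta = digamma(gamma) - digamma(gamma.sum())
        phi = np.exp(Elog_theta[:, None] + Elog_beta[:, words])  # K x N responsibilities
        phi /= phi.sum(axis=0, keepdims=True)
        gamma = alpha + phi.sum(axis=1)
    s = np.zeros_like(lam)
    np.add.at(s.T, words, phi.T)     # accumulate expected counts per word id
    return s

for epoch in range(5):
    for d in rng.permutation(D):     # sweep documents in random order
        new_s = local_step(docs[d], lam)
        stats += new_s - doc_stats[d]    # swap old statistics for new: no learning rate
        doc_stats[d] = new_s
        lam = eta + stats                # exact global update given current statistics

Because each document's contribution is replaced outright rather than blended in with a decaying step size, every update keeps the global statistics consistent with the current local parameters, which is what underlies the monotone increase of the variational bound highlighted in the abstract.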