Authors: Chandrajit Bajaj, Dilin Wang, Qiang Liu, Ziyang Tang
DOI:
Keywords:
Abstract: Stein variational gradient descent (SVGD) is a particle-based inference algorithm that leverages gradient information for efficient approximate inference. In this work, we enhance SVGD by leveraging preconditioning matrices, such as the Hessian and Fisher information matrix, to incorporate geometric information into SVGD updates. We achieve this by presenting a generalization of SVGD that replaces the scalar-valued kernels in vanilla SVGD with more general matrix-valued kernels. This yields a significant extension of SVGD and, importantly, allows us to flexibly incorporate various preconditioning matrices to accelerate exploration of the probability landscape. Empirical results show that our method outperforms a variety of baseline approaches over a range of real-world Bayesian inference tasks.
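To make the idea concrete, below is a minimal sketch (not the authors' implementation) of the simplest matrix-kernel case described in the abstract: a matrix-valued kernel K(x, x') = Q·k(x, x') built from a scalar RBF kernel and a constant preconditioning matrix Q, so the vanilla SVGD direction is simply preconditioned by Q. The target distribution, particle count, step sizes, and the AdaGrad-style step normalization are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: 2-D Gaussian N(mu, Sigma); its score is Sigma^{-1}(mu - x).
mu = np.array([2.0, -1.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])
Sigma_inv = np.linalg.inv(Sigma)

def grad_logp(X):
    return (mu - X) @ Sigma_inv                      # (n, d) score matrix

def svgd_direction(X, Q):
    """Matrix-SVGD direction with K(x, x') = Q * k_rbf(x, x'), constant Q."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]             # x_i - x_j, shape (n, n, d)
    sq = np.sum(diff**2, axis=-1)
    h2 = np.median(sq) / (2.0 * np.log(n + 1))       # median-heuristic bandwidth
    K = np.exp(-sq / (2.0 * h2))                     # scalar RBF part of the kernel
    drive = K @ grad_logp(X)                         # kernel-weighted scores
    repulse = np.einsum('ij,ijd->id', K, diff) / h2  # kernel-gradient repulsion term
    return ((drive + repulse) / n) @ Q               # precondition the whole update by Q

# Particles start away from the target mean.
X = rng.normal(size=(100, 2))
Q = Sigma           # a natural constant preconditioner for this Gaussian target
acc = np.zeros_like(X)
for _ in range(1000):
    phi = svgd_direction(X, Q)
    acc += phi**2                                    # AdaGrad accumulator
    X = X + 0.2 * phi / (np.sqrt(acc) + 1e-8)        # normalized step

print(np.round(X.mean(axis=0), 1))                   # particle mean approaches mu
```

With Q = Sigma the preconditioned drive term points along (mu - x), which is why this constant-Q special case converges faster than an unpreconditioned RBF kernel on ill-conditioned targets; the paper's more general construction lets Q vary with position (e.g., local Hessian or Fisher estimates).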