Authors: Edward Snelson, Zoubin Ghahramani
DOI:
Keywords: Computer science, Local regression, Nonparametric statistics, Gaussian random field, Gaussian filter, Artificial intelligence, Machine learning, Probabilistic logic, Gaussian process, Theoretical computer science, Small set, Gaussian function
Abstract: Gaussian process (GP) models are flexible probabilistic nonparametric models for regression, classification and other tasks. Unfortunately they suffer from computational intractability for large data sets. Over the past decade there have been many different approximations developed to reduce this cost. Most of these can be termed global approximations, in that they try to summarize all the training data via a small set of support points. A different approach is local regression, where many local experts account for their own part of the space. In this paper we start by investigating the regimes in which these approaches work well or fail. We then proceed to develop a new sparse GP approximation which is a combination of both approaches. Theoretically we show that it can be derived as a natural extension of the framework of Quiñonero-Candela and Rasmussen [2005] for sparse GP approximations. We demonstrate the benefits of the combined approximation on some 1D examples for illustration, and on some large real-world data sets.
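The "global approximation" idea the abstract describes — summarizing all training data through a small set of support (inducing) points — can be illustrated with a minimal Subset-of-Regressors sketch. This is my own illustrative example, not code from the paper; the RBF kernel, its hyperparameters, and the inducing-point locations `Xu` are all assumptions.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between row-vector sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sor_predict(X, y, Xu, Xs, noise=0.1):
    """Subset-of-Regressors predictive mean at test points Xs.

    All N training points are summarized through M << N inducing
    points Xu, so the dominant cost is O(N M^2) instead of O(N^3).
    """
    Kuf = rbf(Xu, X)                      # M x N cross-covariance
    Kuu = rbf(Xu, Xu)                     # M x M inducing covariance
    Ksu = rbf(Xs, Xu)                     # S x M test/inducing block
    A = noise**2 * Kuu + Kuf @ Kuf.T      # M x M system, cheap to solve
    w = np.linalg.solve(A, Kuf @ y)
    return Ksu @ w

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
Xu = np.linspace(-3, 3, 10)[:, None]      # small set of support points
Xs = np.array([[0.0]])
print(sor_predict(X, y, Xu, Xs))          # predictive mean near sin(0)
```

The point of the sketch is the shape of the computation: the expensive N x N kernel matrix never appears, only M x N and M x M blocks, which is what makes global sparse approximations tractable on large data sets.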