Covariance Kernels from Bayesian Generative Models

Author: Matthias Seeger

DOI:

Keywords:

Abstract: We propose the framework of mutual information kernels for learning covariance kernels, as used in Support Vector machines and Gaussian process classifiers, from unlabeled task data using Bayesian techniques. We describe an implementation of this framework which uses variational mixtures of factor analyzers in order to attack classification problems in high-dimensional spaces where labeled data is sparse but unlabeled data is abundant.
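To illustrate the general idea of deriving a covariance kernel from a generative model fit to unlabeled data, here is a minimal sketch. It is not the paper's mutual information kernel or its variational mixture-of-factor-analyzers implementation; it uses a plain Gaussian mixture and takes the posterior responsibilities of each point as a learned feature map, whose inner product yields a valid (positive semi-definite) kernel.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic unlabeled data: two well-separated clusters in 3 dimensions.
rng = np.random.default_rng(0)
X_unlabeled = np.vstack([
    rng.normal(-2.0, 1.0, size=(50, 3)),
    rng.normal(+2.0, 1.0, size=(50, 3)),
])

# Fit a generative model (here a 2-component Gaussian mixture) to the
# unlabeled data; no class labels are used at this stage.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X_unlabeled)

def generative_kernel(A, B, model):
    """Kernel induced by the fitted model: inner product of posterior
    responsibilities p(component | x), which is symmetric and PSD."""
    Ra = model.predict_proba(A)  # shape (n_A, n_components)
    Rb = model.predict_proba(B)  # shape (n_B, n_components)
    return Ra @ Rb.T

K = generative_kernel(X_unlabeled, X_unlabeled, gmm)
print(K.shape)  # (100, 100)
```

The resulting Gram matrix `K` could then be passed to any kernel classifier (e.g. an SVM with a precomputed kernel) trained on a small labeled subset, mirroring the sparse-labels / abundant-unlabeled setting described in the abstract.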

References (27)
John C. Platt, Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods, pp. 185–208 (1999)
Michael I. Jordan, Zoubin Ghahramani, Tommi S. Jaakkola, Lawrence K. Saul, An introduction to variational methods for graphical models. Machine Learning, vol. 37, pp. 105–161 (1999), 10.1023/A:1007665907178
D. Haussler, Convolution kernels on discrete structures. Technical report (1999)
Thorsten Joachims, Making large scale SVM learning practical. Technical report (1999), 10.17877/DE290R-14262
Alex J. Smola, Bernhard Schölkopf, Klaus-Robert Müller, The connection between regularization operators and support vector kernels. Neural Networks, vol. 11, pp. 637–649 (1998), 10.1016/S0893-6080(98)00032-X
Avrim Blum, Tom Mitchell, Combining labeled and unlabeled data with co-training. Conference on Learning Theory (COLT), pp. 92–100 (1998), 10.1145/279943.279962
M. E. Tipping, Deriving cluster analytic distance functions from Gaussian mixture models. 9th International Conference on Artificial Neural Networks (ICANN '99), vol. 2, pp. 815–820 (1999), 10.1049/CP:19991212