A Unified Loss Function in Bayesian Framework for Support Vector Regression

Authors: Wei Chu, Chong Jin Ong, S. Sathiya Keerthi

DOI:

Keywords:

Abstract: In this paper, we propose a unified non-quadratic loss function for regression known as the soft insensitive loss function (SILF). SILF is a flexible model and possesses most of the desirable characteristics of popular non-quadratic loss functions, such as the Laplacian, Huber's and Vapnik's ε-insensitive loss functions. We describe the properties of SILF and illustrate our assumption on the underlying noise model in detail. Moreover, the introduction of SILF makes it possible to apply Bayesian techniques to Support Vector methods. Experimental results on simulated and real-world datasets indicate the feasibility of the approach.
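As a rough illustration of the loss described in the abstract, the sketch below implements a soft insensitive loss in the piecewise form used in the authors' related work on Bayesian support vector regression; the parameter names `eps` and `beta` and the exact smoothing form are assumptions here, not taken from this abstract:

```python
def silf(delta, eps=1.0, beta=0.5):
    """Sketch of the soft insensitive loss function (SILF).

    Assumed parameter ranges: eps > 0 and 0 < beta <= 1.
    The loss is flat near zero, quadratically smoothed in a band of
    width 2*beta*eps around |delta| = eps, and linear in the tails.
    """
    a = abs(delta)
    if a <= (1 - beta) * eps:
        # insensitive zone: small residuals incur no loss
        return 0.0
    elif a <= (1 + beta) * eps:
        # quadratic smoothing between the flat zone and the linear tails
        return (a - (1 - beta) * eps) ** 2 / (4 * beta * eps)
    else:
        # Laplacian-like linear tails for robustness to outliers
        return a - eps
```

Under this form, letting `beta` shrink toward 0 recovers Vapnik's ε-insensitive loss, while `beta = 1` removes the flat zone and yields a Huber-style loss; the quadratic segment is what makes the function continuously differentiable, which is what enables the Bayesian treatment mentioned in the abstract.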

References (11)
James Tin-Yau Kwok, "Integrating the evidence framework and the support vector machine", European Symposium on Artificial Neural Networks, pp. 177-182 (1999)
Geoffrey Hinton, Radford M. Neal, "Bayesian Learning for Neural Networks" (1995)
Massimiliano Pontil, Sayan Mukherjee, Federico Girosi, "On the Noise Model of Support Vector Machines Regression", Lecture Notes in Computer Science, pp. 316-324 (2000), 10.1007/3-540-40992-0_24
David J. C. MacKay, "Bayesian Methods for Backpropagation Networks", Models of Neural Networks III, Series: Physics of Neural Networks, pp. 211-254 (1996), 10.1007/978-1-4612-0723-8_6
Craig Saunders, Alexander Gammerman, Volodya Vovk, "Ridge Regression Learning Algorithm in Dual Variables", International Conference on Machine Learning, pp. 515-521 (1998)
Federico Girosi, "Models of Noise and Robust Estimation", MIT Artificial Intelligence Laboratory, Massachusetts Institute of Technology (1991)
Michael E. Tipping, "The Relevance Vector Machine", Neural Information Processing Systems, vol. 12, pp. 652-658 (1999)