Authors: Yuh-Jye Lee, Wen-Feng Hsieh, Chien-Ming Huang
DOI: 10.1109/TKDE.2005.77
Keywords: Least squares support vector machine; Support vector machine; Mathematics; Kernel (linear algebra); Algorithm; Convex optimization; Quadratic programming; Smoothing; Kernel method; Computational complexity theory
Abstract: A new smoothing strategy for solving ε-support vector regression (ε-SVR), tolerating a small error in fitting a given data set linearly or nonlinearly, is proposed in this paper. Conventionally, ε-SVR is formulated as a constrained minimization problem, namely, a convex quadratic programming problem. We apply the smoothing techniques that have been used for the support vector machine for classification to replace the ε-insensitive loss function by an accurate smooth approximation. This allows us to solve ε-SVR as an unconstrained minimization problem directly. We term this reformulated problem ε-smooth support vector regression (ε-SSVR). We also prescribe a Newton-Armijo algorithm, which has been shown to be globally and quadratically convergent, for solving our ε-SSVR. In order to handle the case of nonlinear regression with a massive data set, we introduce the reduced kernel technique in this paper to avoid the computational difficulties of dealing with a huge and fully dense kernel matrix. Numerical results and comparisons are demonstrated to show the effectiveness and speed of the algorithm.
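The following is a minimal sketch, not the authors' code, of the smoothing idea the abstract describes: the ε-insensitive loss max(|x| − ε, 0) is replaced by a smooth approximation built from the p-function p(x, β) = x + (1/β)·log(1 + exp(−βx)), which smoothly approximates the plus function max(x, 0). The particular squared form p(x − ε, β)² + p(−x − ε, β)² is one common choice in the smooth-SVR literature and is assumed here for illustration; function names and parameter values are hypothetical.

```python
import numpy as np

def p(x, beta):
    """Smooth approximation of max(x, 0): x + (1/beta) * log(1 + exp(-beta * x))."""
    # np.logaddexp(0, -beta * x) computes log(1 + exp(-beta * x)) stably.
    return x + np.logaddexp(0.0, -beta * x) / beta

def eps_insensitive(x, eps):
    """Exact epsilon-insensitive loss |x|_eps = max(|x| - eps, 0)."""
    return np.maximum(np.abs(x) - eps, 0.0)

def smooth_eps_insensitive_sq(x, eps, beta):
    """Smooth approximation of the squared epsilon-insensitive loss
    (an assumed form: p(x - eps, beta)^2 + p(-x - eps, beta)^2)."""
    return p(x - eps, beta) ** 2 + p(-x - eps, beta) ** 2

# With a smooth, twice-differentiable loss, a regression objective such as
#   min_{w,b}  0.5 * ||w||^2 + C * sum_i loss(y_i - (w . x_i + b))
# becomes an unconstrained problem that a Newton-type method with an Armijo
# line search can minimize directly, instead of solving a constrained QP.

if __name__ == "__main__":
    r = np.linspace(-2.0, 2.0, 5)
    print(eps_insensitive(r, eps=0.5))
    print(smooth_eps_insensitive_sq(r, eps=0.5, beta=10.0))
```

As β grows, the smooth loss approaches the exact (squared) ε-insensitive loss, which is what makes a globally and quadratically convergent Newton-Armijo scheme applicable to the unconstrained reformulation.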