Authors: Sheng Chen, Xunxian Wang, David J. Brown
DOI:
Keywords: Boosting (machine learning), Diagonal matrix, Mean squared error, Algorithm, Statistics, Kernel method, Mathematics, Covariance matrix, Append, Regression, Regression analysis
Abstract: A novel technique is presented to construct sparse regression models based on the orthogonal least squares method with boosting. The technique tunes the mean vector and diagonal covariance matrix of each individual regressor by incrementally minimizing the training mean square error. A weighted optimization method based on boosting is developed to append regressors one by one in an orthogonal forward selection procedure. Experimental results obtained using this technique demonstrate that it offers a viable alternative to existing state-of-the-art kernel modeling methods for constructing parsimonious regression models.
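To make the incremental forward-selection idea concrete, the following is a minimal, simplified sketch, not the authors' algorithm: it replaces the boosting-based weighted optimization with a plain random candidate search over Gaussian regressor parameters (mean and diagonal covariance), and fits each new regressor to the current residual rather than performing an explicit orthogonal decomposition. All function names (`gaussian_regressor`, `incremental_ofs`) and parameter choices are illustrative assumptions.

```python
import numpy as np

def gaussian_regressor(X, mu, diag_cov):
    """Gaussian basis function with a tunable mean and diagonal covariance."""
    d = (X - mu) ** 2 / diag_cov            # element-wise squared Mahalanobis terms
    return np.exp(-0.5 * d.sum(axis=1))

def incremental_ofs(X, y, n_terms=5, n_candidates=200, rng=None):
    """Toy incremental forward selection: at each stage, draw candidate
    (mean, diagonal covariance) pairs, keep the one that most reduces the
    training MSE on the residual, and append it to the model.
    NOTE: a stand-in for the paper's boosting-based weighted search."""
    rng = np.random.default_rng(rng)
    n, dim = X.shape
    residual = y.copy()
    model = []                               # list of (mu, diag_cov, weight)
    for _ in range(n_terms):
        best = None
        for _ in range(n_candidates):
            mu = X[rng.integers(n)]                        # centre drawn from the data
            diag_cov = rng.uniform(0.1, 2.0, size=dim)     # random diagonal covariance
            phi = gaussian_regressor(X, mu, diag_cov)
            w = phi @ residual / (phi @ phi + 1e-12)       # least-squares weight
            mse = np.mean((residual - w * phi) ** 2)
            if best is None or mse < best[0]:
                best = (mse, mu, diag_cov, w, phi)
        _, mu, diag_cov, w, phi = best
        model.append((mu, diag_cov, w))
        residual = residual - w * phi                      # next regressor fits the residual
    return model

# Usage on a 1-D toy problem
X = np.linspace(-3, 3, 100).reshape(-1, 1)
y = np.sinc(X[:, 0])
model = incremental_ofs(X, y, n_terms=4, rng=0)
print(f"selected {len(model)} regressors")
```

The sketch keeps the two ingredients emphasized in the abstract: regressors are appended one by one, and each regressor's mean and diagonal covariance are tuned to reduce the training error at the moment it is added, which is what yields a parsimonious (sparse) model.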