Support vector regression with ANOVA decomposition kernels

Authors: Alex Gammerman, Vladimir Vapnik, Jason Weston, Mark O. Stitson, Chris Watkins

DOI:

Keywords:

Abstract: Support Vector Machines using ANOVA Decomposition Kernels (SVAD) [Vap98] are a way of imposing structure on multi-dimensional kernels which are generated as the tensor product of one-dimensional kernels. This gives more accurate control over the capacity of the learning machine (VC-dimension). SVAD uses ideas from ANOVA decomposition methods and extends them to generate kernels which directly implement these ideas. SVAD is used with spline kernels, and results show that it performs better than the respective non-ANOVA kernel. The Boston housing data set from UCI, which has previously been tested with Bagging [Bre94] [DBK97], is used to compare against that method.
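To make the construction in the abstract concrete, below is a minimal Python sketch (not the paper's implementation) of an order-p ANOVA kernel: the sum, over all size-p subsets of coordinates, of products of a one-dimensional base kernel, computed with the standard Newton-Girard recursion for elementary symmetric polynomials. The linear base kernel used here is a hypothetical stand-in for the spline kernels used in the paper.

```python
import numpy as np

def anova_kernel(x, y, k1d, p):
    """Order-p ANOVA kernel: sum over all size-p coordinate subsets of the
    product of one-dimensional base kernels k1d(x_j, y_j).

    Computed via the Newton-Girard recursion, so cost is O(d * p) rather
    than enumerating all C(d, p) subsets. A sketch, not the paper's code.
    """
    d = len(x)
    # Base kernel evaluated coordinate-wise: b_j = k1d(x_j, y_j)
    base = np.array([k1d(x[j], y[j]) for j in range(d)])
    # Power sums p_s = sum_j b_j^s for s = 1..p
    ps = np.array([np.sum(base ** s) for s in range(1, p + 1)])
    # e[m] = elementary symmetric polynomial of order m in the b_j,
    # i.e. the order-m ANOVA kernel; e[0] = 1 by convention.
    e = np.zeros(p + 1)
    e[0] = 1.0
    for m in range(1, p + 1):
        e[m] = sum((-1) ** (s + 1) * e[m - s] * ps[s - 1]
                   for s in range(1, m + 1)) / m
    return e[p]

# Usage: order-2 ANOVA kernel over 3-dimensional inputs, with a linear
# one-dimensional base kernel standing in for a spline kernel.
x = np.array([0.2, 0.5, 0.9])
y = np.array([0.1, 0.4, 0.8])
print(anova_kernel(x, y, lambda a, b: a * b, p=2))
```

Restricting the kernel to order-p interactions in this way is what gives the tighter control over capacity (VC-dimension) that the abstract refers to, compared with a full tensor-product kernel over all d coordinates.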

References (6)
Vladimir Naumovich Vapnik, Estimation of Dependences Based on Empirical Data, (2010)
Alex J. Smola, Vladimir Vapnik, Steven E. Golowich, Support Vector Method for Function Approximation, Regression Estimation and Signal Processing, Neural Information Processing Systems, vol. 9, pp. 281-287, (1996)
Alex J. Smola, Vladimir Vapnik, Linda Kaufman, Christopher J. C. Burges, Harris Drucker, Support Vector Regression Machines, Neural Information Processing Systems, vol. 9, pp. 155-161, (1996)
Vladimir Naumovich Vapnik, Statistical Learning Theory, John Wiley & Sons, (1998)
Leo Breiman, Bagging Predictors, Machine Learning, vol. 24, pp. 123-140, (1996), 10.1023/A:1018054314350