On Kernelization of Supervised Mahalanobis Distance Learners

Authors: Ratthachat Chatpatanasiri, Teesid Korsrilabutr, Boonserm Kijsirikul, Pasakorn Tangchanachaianan

DOI:

Keywords:

Abstract: This paper focuses on the problem of kernelizing an existing supervised Mahalanobis distance learner. The following features are included in the paper. Firstly, three popular learners, namely "neighborhood component analysis", "large margin nearest neighbors" and "discriminant neighborhood embedding", which do not have kernel versions, are kernelized in order to improve their classification performances. Secondly, an alternative kernelization framework called the "KPCA trick" is presented. Implementing a learner in the new framework gains several advantages over the standard framework, e.g. no mathematical formulas and no reprogramming are required for a kernel implementation, and the framework avoids troublesome problems such as singularity. Thirdly, while the truths of representer theorems are just assumptions in previous papers related to ours, here they are formally proven. The proofs validate both the kernel trick and the KPCA trick in the context of Mahalanobis distance learning. Fourthly, unlike previous works which always apply brute-force methods to select a kernel, we investigate two approaches which can be efficiently adopted to construct an appropriate kernel for a given dataset. Finally, numerical results on various real-world datasets are presented.
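To make the "KPCA trick" concrete, the sketch below (not taken from the paper) first maps the data with kernel PCA and then trains an unmodified linear metric learner in the transformed space, so no kernelized reformulation of the learner is needed. It assumes scikit-learn's KernelPCA and NeighborhoodComponentsAnalysis as stand-ins for the paper's learners; the RBF kernel and all parameter values are illustrative choices, not the paper's settings.

```python
# Minimal sketch of the KPCA-trick idea: KPCA feature map, then an ordinary
# *linear* Mahalanobis learner (NCA here), then k-NN in the learned metric.
from sklearn.datasets import load_iris
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import NeighborhoodComponentsAnalysis, KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

pipeline = make_pipeline(
    # Step 1: nonlinear feature map via kernel PCA (RBF kernel chosen arbitrarily).
    KernelPCA(n_components=20, kernel="rbf", gamma=0.1),
    # Step 2: the unchanged linear NCA learner runs on the KPCA features.
    NeighborhoodComponentsAnalysis(n_components=2, random_state=0),
    # Step 3: nearest-neighbor classification in the learned metric space.
    KNeighborsClassifier(n_neighbors=3),
)

scores = cross_val_score(pipeline, X, y, cv=5)
print("5-fold accuracy with the KPCA-trick pipeline: %.3f" % scores.mean())
```

Because the learner itself is untouched, swapping in another linear learner such as LMNN or DNE would only change the middle step of the pipeline, which is the practical advantage the abstract claims for this framework.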
