Authors: Ratthachat Chatpatanasiri, Teesid Korsrilabutr, Boonserm Kijsirikul, Pasakorn Tangchanachaianan
DOI:
Keywords:
Abstract: This paper focuses on the problem of kernelizing an existing supervised Mahalanobis distance learner. The following features are included in the paper. Firstly, three popular learners, namely "neighborhood component analysis", "large margin nearest neighbors" and "discriminant neighborhood embedding", which do not have kernel versions, are kernelized in order to improve their classification performances. Secondly, an alternative kernelization framework called the "KPCA trick" is presented. Implementing a learner in the new framework gains several advantages over the standard framework; e.g., no mathematical formulas and no reprogramming are required for a kernel implementation, and the framework avoids troublesome problems such as singularity. Thirdly, while the truths of the representer theorems are mere assumptions in previous papers related to ours, here they are formally proven. The proofs validate both the kernel trick and the KPCA trick in the context of Mahalanobis distance learning. Fourthly, unlike previous works, which always apply brute-force methods to select a kernel, we investigate two approaches that can be efficiently adopted to construct an appropriate kernel for a given dataset. Finally, numerical results on various real-world datasets are presented.
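A minimal sketch of the KPCA trick described in the abstract, assuming scikit-learn: rather than deriving a kernel version of a Mahalanobis learner by hand, the data are first mapped into a kernel feature space via kernel PCA, and the unmodified Euclidean-space learner (here NCA, one of the three learners the paper kernelizes) is then run in that space. The dataset, the RBF kernel, and all hyperparameters below are illustrative assumptions, not the paper's settings.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import make_pipeline

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: KPCA maps inputs into a (truncated) kernel feature space.
# Step 2: NCA learns a Mahalanobis distance in that space, which amounts
#         to a kernelized NCA on the original inputs.
# Step 3: kNN classifies with the learned distance.
kernel_nca = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=0.5),  # assumed kernel/params
    NeighborhoodComponentsAnalysis(n_components=2, random_state=0),
    KNeighborsClassifier(n_neighbors=3),
)
kernel_nca.fit(X_train, y_train)
print("test accuracy:", kernel_nca.score(X_test, y_test))
```

Note that no formula of NCA itself had to be rewritten in kernel form; swapping the `KernelPCA` step for another feature map, or `NeighborhoodComponentsAnalysis` for another Mahalanobis learner, is enough to kernelize a different method, which is the convenience the abstract attributes to the KPCA trick.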