Authors: Byungki Byun, Chin-Hui Lee
DOI: 10.1109/ICASSP.2011.5946732
Keywords:
Abstract: We propose a kernelized maximal-figure-of-merit (MFoM) learning approach to efficiently training nonlinear models using subspace distance minimization. In particular, a fixed, small number of samples is chosen in such a way that the distance between the function spaces constructed with the subset and with the entire data set is minimized. This construction enables us to learn with the subset while keeping the resulting model nearly optimal compared with one trained on the whole set. We show that the subspace distance can be minimized through the Nystrom extension. Experimental results on various machine learning problems demonstrate clear advantages of the proposed technique over the case where the function space is built with randomly selected samples. Additional comparisons show that the model trained this way achieves comparable performance while reducing training time tremendously.
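The abstract's central computational device, approximating the function space of the full data set with a small subset of samples via the Nystrom extension, can be sketched as follows. This is a minimal illustration of the standard Nystrom low-rank kernel approximation, not the authors' MFoM training procedure; the RBF kernel, function names, and data sizes are assumptions for the example:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """RBF (Gaussian) kernel matrix between rows of X and rows of Y."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def nystrom_approx(X, subset_idx, gamma=0.5):
    """Nystrom approximation K ~= C W^+ C^T built from a sample subset.

    C is the n x m kernel block between all points and the subset,
    W is the m x m kernel block among subset points.
    """
    C = rbf_kernel(X, X[subset_idx], gamma)   # n x m
    W = C[subset_idx, :]                      # m x m
    return C @ np.linalg.pinv(W) @ C.T        # n x n low-rank approximation

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
K = rbf_kernel(X, X)                          # exact kernel matrix
K_hat = nystrom_approx(X, rng.choice(200, 40, replace=False))
# relative Frobenius error of the subset-based approximation
err = np.linalg.norm(K - K_hat, "fro") / np.linalg.norm(K, "fro")
```

A better-than-random choice of the subset (as the paper proposes) would further reduce `err` for a given subset size.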