Authors: Yao-liang Yu, Csaba Szepesvári
DOI:
Keywords:
Abstract: In real supervised learning scenarios, it is not uncommon for the training and test samples to follow different probability distributions, thus rendering it necessary to correct the sampling bias. Focusing on a particular covariate shift problem, we derive high-confidence bounds for the kernel mean matching (KMM) estimator, whose convergence rate turns out to depend on some regularity measure of the regression function and also on some capacity measure of the kernel. By comparing KMM with the natural plug-in estimator, we establish the superiority of the former and hence provide concrete evidence for, and understanding of, the effectiveness of KMM under covariate shift.
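To illustrate the estimator analyzed in the abstract, below is a minimal sketch of kernel mean matching: importance weights on the training sample are chosen so that the weighted training mean in a Gaussian RKHS matches the test sample mean, solved as a box-constrained quadratic program. All names, the kernel bandwidth, the box bound `B`, and the use of `scipy.optimize.minimize` are illustrative choices, not the paper's exact formulation (in particular, the usual normalization constraint on the weights is omitted for simplicity).

```python
import numpy as np
from scipy.optimize import minimize

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian kernel matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kmm_weights(X_tr, X_te, sigma=1.0, B=10.0):
    """Sketch of KMM: minimize || (1/n_tr) sum_i b_i phi(x_i^tr)
    - (1/n_te) sum_j phi(x_j^te) ||_H^2 over 0 <= b_i <= B."""
    n_tr, n_te = len(X_tr), len(X_te)
    K = gaussian_kernel(X_tr, X_tr, sigma)                     # Gram matrix on training points
    kappa = (n_tr / n_te) * gaussian_kernel(X_tr, X_te, sigma).sum(axis=1)

    # Equivalent quadratic objective (constants dropped): 0.5 b'Kb - kappa'b.
    obj = lambda b: 0.5 * b @ K @ b - kappa @ b
    grad = lambda b: K @ b - kappa
    res = minimize(obj, np.ones(n_tr), jac=grad, method="L-BFGS-B",
                   bounds=[(0.0, B)] * n_tr)
    return res.x

# Toy covariate shift: training covariates centered at 0, test covariates at 1.
rng = np.random.default_rng(0)
X_tr = rng.normal(0.0, 1.0, size=(80, 1))
X_te = rng.normal(1.0, 1.0, size=(80, 1))
w = kmm_weights(X_tr, X_te)
```

On this toy example the learned weights are larger for training points lying where the test density is high, which is exactly the reweighting that bias correction under covariate shift requires.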