Authors: Hirotaka Hachiya, Hyunha Nam, Masashi Sugiyama, Jaak Simm, Makoto Yamada
DOI:
Keywords: Kernel (statistics), Polynomial kernel, Logistic model tree, Computer science, Principal component regression, Variable kernel density estimation, Kernel regression, Linear model, Artificial intelligence, Probabilistic classification, Pattern recognition
Abstract: The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression (KLR). The key idea behind the speedup is that, unlike KLR, which uses maximum likelihood estimation of a log-linear model, LSPC uses least-squares estimation of a linear model. This allows the global solution to be obtained analytically in a classwise manner. In exchange for the speedup, however, this formulation does not necessarily produce non-negative probability estimates. Nevertheless, consistency is guaranteed in the large-sample limit, and rounding negative estimates up to zero in finite-sample cases was demonstrated experimentally not to degrade classification performance. Thus, LSPC is a practically useful probabilistic classifier. In this paper, we give an overview of LSPC and its extensions to covariate-shift, multi-task, and multi-label scenarios. A MATLAB implementation is available from 'http://sugiyama-www.cs.titech.ac.jp/sugi/software/LSPC/'.
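The abstract's core recipe can be sketched in a few lines: fit one regularized least-squares kernel model per class against 0/1 class-indicator targets (this is the classwise analytic solution), then round negative outputs up to zero and normalize across classes to obtain posterior probabilities. The sketch below is an illustrative NumPy implementation of that idea, not the authors' reference code; the Gaussian kernel width `sigma` and ridge parameter `lam` are assumed hyperparameters.

```python
import numpy as np

def gaussian_kernel(X, C, sigma=1.0):
    # Pairwise Gaussian kernel values between samples X and centers C.
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lspc_fit(X, y, n_classes, sigma=1.0, lam=0.1):
    # Use the training points themselves as kernel centers.
    K = gaussian_kernel(X, X, sigma)
    A = K.T @ K + lam * np.eye(K.shape[1])
    T = np.eye(n_classes)[y]           # 0/1 class-indicator targets
    # One analytic least-squares solve per class (columns of T).
    Theta = np.linalg.solve(A, K.T @ T)
    return X, Theta

def lspc_predict_proba(model, Xtest, sigma=1.0):
    centers, Theta = model
    q = gaussian_kernel(Xtest, centers, sigma) @ Theta
    q = np.maximum(q, 0.0)             # round negative estimates up to zero
    s = q.sum(axis=1, keepdims=True)
    s[s == 0.0] = 1.0                  # guard against all-zero rows
    return q / s                       # normalize to class posteriors

# Toy usage: two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (30, 2)), rng.normal(2, 0.5, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
model = lspc_fit(X, y, n_classes=2)
P = lspc_predict_proba(model, X)
acc = (P.argmax(axis=1) == y).mean()
```

Because each class's coefficients come from a single linear solve, training cost is dominated by one matrix factorization rather than the iterative optimization KLR requires.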