Authors: Jing Lu, Steven C. H. Hoi, Jialei Wang, Peilin Zhao, Zhi-Yong Liu
Keywords:
Abstract: In this paper, we present a new framework for large scale online kernel learning, making kernel methods efficient and scalable for large-scale online learning applications. Unlike the regular budget online kernel learning scheme, which usually uses budget maintenance strategies to bound the number of support vectors, our framework explores a completely different approach of kernel functional approximation techniques to make the subsequent online learning task efficient and scalable. Specifically, we present two different online kernel machine learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We explore these two approaches to tackle three online learning tasks: binary classification, multi-class classification, and regression. The encouraging results of our experiments on large-scale datasets validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.
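To illustrate the two approximation ideas the abstract names, here is a minimal NumPy sketch: random Fourier features (the core of FOGD) and a Nyström feature map (the core of NOGD), each followed by a plain online gradient step. This is not the paper's implementation; the RBF kernel choice, the hinge-loss update, and the names `rff_features`, `fogd_train`, and `nystrom_features` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, D=100, sigma=1.0):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-||x - y||^2 / (2 sigma^2)).
    Returns an (n, D) matrix Z with Z @ Z.T ~= K."""
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, D))  # w_i ~ N(0, sigma^-2 I)
    b = rng.uniform(0, 2 * np.pi, size=D)           # b_i ~ U(0, 2*pi)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

def fogd_train(X, y, D=100, sigma=1.0, eta=0.1):
    """Online gradient descent with hinge loss in the random-feature
    space (an FOGD-style sketch, hypothetical hyperparameters)."""
    Z = rff_features(X, D, sigma)
    w = np.zeros(D)
    for z_t, y_t in zip(Z, y):       # one pass over the data stream
        if y_t * (w @ z_t) < 1.0:    # margin violated -> update
            w += eta * y_t * z_t
    return w

def nystrom_features(X, m=50, sigma=1.0):
    """Nyström approximation of the RBF kernel matrix: pick m landmark
    points, then map x -> k(x, landmarks) @ K_mm^(-1/2), so the online
    learner again only needs a linear model on the features."""
    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma ** 2))
    idx = rng.choice(len(X), size=m, replace=False)
    L = X[idx]                                   # landmark points
    K_mm = rbf(L, L)
    vals, vecs = np.linalg.eigh(K_mm)
    vals = np.maximum(vals, 1e-12)               # guard tiny eigenvalues
    K_mm_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return rbf(X, L) @ K_mm_inv_sqrt             # (n, m) feature matrix
```

In both cases the nonlinear kernel computation is replaced by a fixed-dimensional feature map, so each online update costs O(D) or O(m) instead of growing with the number of support vectors, which is the scalability argument the abstract makes.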