Authors: Kenneth L. Clarkson, Elad Hazan, David P. Woodruff
DOI: 10.1109/FOCS.2010.50
Keywords:
Abstract: We give sublinear-time approximation algorithms for some optimization problems arising in machine learning, such as training linear classifiers and finding minimum enclosing balls. Our algorithms can be extended to some kernelized versions of these problems, such as SVDD, hard margin SVM, and $L_2$-SVM, for which sublinear-time algorithms were not known before. These new algorithms use a combination of novel sampling techniques and a new multiplicative update algorithm. We give lower bounds which show the running times of many of our algorithms to be nearly best possible in the unit-cost RAM model. We also give implementations of our algorithms in the semi-streaming setting, obtaining the first low-pass, polylogarithmic-space and sublinear-time algorithms achieving an arbitrary approximation factor.
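The abstract mentions a multiplicative update algorithm as one of the two core techniques. As context for readers unfamiliar with the method, below is a minimal, generic sketch of the classic multiplicative-weights update over a fixed set of "experts" — this is an illustration of the general technique only, not the paper's specific sublinear algorithm (which combines it with randomized sampling of coordinates and examples); the function name, learning rate `eta`, and loss setup are all assumptions for the example.

```python
import numpy as np

def multiplicative_weights(losses, eta=0.1):
    """Generic multiplicative-weights update over n experts.

    losses: T x n array; losses[t, i] is expert i's loss in round t,
            assumed to lie in [0, 1].
    Returns the index of the best expert in hindsight and the final
    (normalized) weight distribution, which concentrates on low-loss
    experts as T grows.
    """
    T, n = losses.shape
    w = np.ones(n)               # start with uniform weights
    total = np.zeros(n)          # cumulative loss per expert
    for t in range(T):
        total += losses[t]
        w *= (1.0 - eta) ** losses[t]   # shrink weights of lossy experts
    return int(total.argmin()), w / w.sum()
```

The standard regret bound for this scheme says the algorithm's expected cumulative loss is within roughly a (1 + eta) factor plus ln(n)/eta of the best single expert, which is the property the paper's primal-dual approach exploits.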