Authors: Pieter Abbeel, Andrew Y. Ng, Honglak Lee, Su-In Lee
DOI:
Keywords:
Abstract: L1 regularized logistic regression is now a workhorse of machine learning: it is widely used for many classification problems, particularly ones with many features. L1 regularized logistic regression requires solving a convex optimization problem. However, standard algorithms for solving convex optimization problems do not scale well enough to handle the large datasets encountered in many practical settings. In this paper, we propose an efficient algorithm for L1 regularized logistic regression. Our algorithm iteratively approximates the objective function by a quadratic approximation at the current point, while maintaining the L1 constraint. In each iteration, it uses the efficient LARS (Least Angle Regression) algorithm to solve the resulting L1 constrained quadratic optimization problem. Our theoretical results show that our algorithm is guaranteed to converge to a global optimum. Our experiments show that our algorithm significantly outperforms standard algorithms for solving convex optimization problems. Moreover, it outperforms four previously published algorithms that were specifically designed to solve the L1 regularized logistic regression problem.
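The iterative scheme described in the abstract (a quadratic approximation of the logistic loss at the current point, with an L1-regularized least-squares subproblem at each step) can be sketched as an IRLS-style loop. The sketch below is an illustration only, not the authors' implementation: it uses scikit-learn's `Lasso` (penalized form) as a stand-in for the constrained LARS solve the paper uses, and the function name `irls_lasso_logreg`, the penalty `alpha`, and the iteration count are all assumptions for the example.

```python
import numpy as np
from sklearn.linear_model import Lasso


def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))


def irls_lasso_logreg(X, y, alpha=0.01, n_iter=20):
    """Sketch of the abstract's idea: at each iteration, form a
    quadratic (weighted least-squares) approximation of the logistic
    loss at the current point, then solve an L1-regularized
    least-squares subproblem (Lasso here, standing in for LARS)."""
    n, d = X.shape
    beta = np.zeros(d)
    for _ in range(n_iter):
        eta = X @ beta
        p = sigmoid(eta)
        w = np.clip(p * (1.0 - p), 1e-6, None)  # IRLS weights
        z = eta + (y - p) / w                   # working response
        sw = np.sqrt(w)
        # Weighted least squares reduces to plain Lasso on rescaled data.
        sub = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        sub.fit(X * sw[:, None], z * sw)
        beta = sub.coef_
    return beta
```

Rescaling rows by the square root of the IRLS weights turns the weighted quadratic subproblem into an ordinary L1-penalized least-squares fit, which is why a standard Lasso (or LARS) solver applies directly at each step.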