Authors: Virginia Smith, Simone Forte, Chenxin Ma, Martin Takáč, Michael I. Jordan
Abstract: The scale of modern datasets necessitates the development of efficient distributed optimization methods for machine learning. We present a general-purpose framework for distributed computing environments, CoCoA, that has an efficient communication scheme and is applicable to a wide variety of problems in machine learning and signal processing. We extend the framework to cover general non-strongly-convex regularizers, including L1-regularized problems like lasso, sparse logistic regression, and elastic net regularization, and show how earlier work can be derived as a special case. We provide convergence guarantees for the class of convex regularized loss minimization objectives, leveraging a novel approach in handling non-strongly-convex regularizers and non-smooth loss functions. The resulting framework has markedly improved performance over state-of-the-art methods, which we illustrate with an extensive set of experiments on real distributed datasets.
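As a sketch of the problem class the abstract refers to (a hedged illustration; the notation below is assumed for exposition, not taken from the text shown here), convex regularized loss minimization problems of this kind can be written as

  \min_{\alpha \in \mathbb{R}^n} \; f(A\alpha) + \sum_{i=1}^{n} g_i(\alpha_i),

where A \in \mathbb{R}^{d \times n} is the data matrix, f is a smooth convex loss term, and each g_i is a convex but possibly non-strongly-convex and non-smooth regularizer; for example, choosing g_i(\alpha_i) = \lambda |\alpha_i| yields the L1 penalty of the lasso mentioned in the abstract.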