Authors: Volkan Cevher, Olivier Fercoq, Quang Van Nguyen
DOI:
Keywords: Computer science, Convex optimization, Linear map, Basis pursuit, Smoothing, Applied mathematics, Acceleration (differential geometry), Differentiable function, Rate of convergence, Convex function
Abstract: We introduce and analyze an algorithm for the minimization of convex functions that are the sum of differentiable terms and proximable terms composed with linear operators. The method builds upon the recently developed smoothed gap technique. In addition to a precise convergence rate result, valid even in the presence of inclusion constraints, this new method allows the explicit treatment of the gradient of the differentiable terms and can be enhanced with line-search. We also study the consequences of restarting the acceleration at a given frequency. These features, which are not classical for primal-dual methods, allow us to solve difficult large-scale optimization problems. We numerically illustrate the superior performance of the method on basis pursuit, TV-regularized least squares regression, and L1 regression problems against the state-of-the-art.