Authors: Yuyuan Ouyang, Trevor Squires
DOI:
Keywords:
Abstract: In this paper, we present a first-order projection-free method, namely, the universal conditional gradient sliding (UCGS) method, for solving $\varepsilon$-approximate solutions to convex differentiable optimization problems. For objective functions with H\"older continuous gradients, we show that UCGS is able to terminate with $\varepsilon$-solutions using at most $O((M_\nu D_X^{1+\nu}/\varepsilon)^{2/(1+3\nu)})$ gradient evaluations and $O((M_\nu D_X^{1+\nu}/\varepsilon)^{4/(1+3\nu)})$ linear objective optimizations, where $\nu\in(0,1]$ and $M_\nu>0$ are the exponent and constant of the H\"older condition. Furthermore, UCGS is able to perform such computations without requiring any specific knowledge of the smoothness information $\nu$ and $M_\nu$. In the weakly smooth case when $\nu\in(0,1)$, both complexity results improve the current state-of-the-art $O((M_\nu D_X^{1+\nu}/\varepsilon)^{1/\nu})$ results on first-order projection-free methods achieved by the conditional gradient method. Within the class of sliding-type algorithms, to the best of our knowledge, this is the first time an algorithm improves not only the gradient complexity but also the overall complexity for computing an approximate solution. In the smooth case when $\nu=1$, UCGS matches the state-of-the-art complexity result but adds more features allowing for practical implementation.
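For reference, the H\"older condition underlying these bounds is the standard one from this literature (the abstract itself does not spell it out): $\|\nabla f(x)-\nabla f(y)\|_*\le M_\nu\|x-y\|^\nu$ for all $x,y\in X$, where $\nu\in(0,1]$ is the exponent and $M_\nu>0$ the constant, and $D_X:=\max_{x,y\in X}\|x-y\|$ denotes the diameter of the feasible set $X$. Setting $\nu=1$ recovers the usual smooth case with $M_1$-Lipschitz gradient, while $\nu\in(0,1)$ gives the weakly smooth case discussed above.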