Authors: Angelia Nedic, Alex Olshevsky
DOI:
Keywords: Conic optimization, Convex analysis, Subderivative, Discrete mathematics, Drift plus penalty, Convex combination, Convex function, Combinatorics, Proper convex function, Convex optimization, Mathematics
Abstract: We investigate the convergence rate of the recently proposed subgradient-push method for distributed optimization over time-varying directed graphs. The subgradient-push method can be implemented in a distributed way without requiring knowledge of either the number of agents or the graph sequence; each node is only required to know its out-degree at each time. Our main result is a convergence rate of $O\left((\ln t)/t\right)$ for strongly convex functions with Lipschitz gradients, even if only stochastic gradient samples are available; this is asymptotically faster than the $O\left((\ln t)/\sqrt{t}\right)$ rate previously known for (general) convex functions.
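To make the iteration concrete, below is a minimal Python sketch of a push-sum subgradient update of the kind the abstract describes. The objective (private strongly convex quadratics), the random time-varying digraph, the noise level, and the $1/t$ step size are all illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

# Hypothetical setup: n nodes, each holding a private strongly convex
# quadratic f_i(x) = 0.5 * (x - a_i)^2; the network minimizes sum_i f_i,
# whose optimum is a.mean(). All parameters here are illustrative.
rng = np.random.default_rng(0)
n, T = 5, 2000
a = rng.normal(size=n)

x = np.zeros(n)   # iterates
y = np.ones(n)    # push-sum weights
z = x / y         # de-biased estimates

for t in range(1, T + 1):
    # Time-varying directed graph: each node keeps a self-loop and sends to
    # one random out-neighbor. Scaling each column by the sender's out-degree
    # makes A column-stochastic, which is all push-sum requires; each node
    # only needs to know its own out-degree.
    A = np.eye(n)
    for i in range(n):
        A[rng.integers(n), i] = 1.0
    A /= A.sum(axis=0, keepdims=True)

    w = A @ x                                   # push step on the iterates
    y = A @ y                                   # push step on the weights
    z = w / y                                   # de-bias by the weights
    grad = (z - a) + 0.1 * rng.normal(size=n)   # stochastic gradient samples
    x = w - (1.0 / t) * grad                    # diminishing step size ~ 1/t

print("consensus estimates:", z)
print("true optimum:", a.mean())
```

Running this, the per-node estimates `z` cluster near `a.mean()`, illustrating the consensus-plus-optimization behavior; the $1/t$ step size is the regime in which the paper's $O\left((\ln t)/t\right)$ rate applies for strongly convex objectives.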