Authors: Amirhossein Reisizadeh, Aryan Mokhtari, Hamed Hassani, Ramtin Pedarsani
Keywords:
Abstract: We consider the problem of decentralized consensus optimization, where the sum of $n$ convex functions is minimized over $n$ distributed agents that form a connected network. In particular, we consider the case where the local decision variables communicated among nodes are quantized in order to alleviate the communication bottleneck in distributed optimization. We propose the Quantized Decentralized Gradient Descent (QDGD) algorithm, in which nodes update their local decision variables by combining the quantized information received from their neighbors with their local information. We prove that, under standard strong convexity and smoothness assumptions on the cost functions, QDGD achieves a vanishing mean solution error. To the best of our knowledge, this is the first algorithm that achieves vanishing consensus error in the presence of quantization noise. Moreover, we provide simulation results that show tight agreement between our derived theoretical convergence rate and the experimental results.
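The update described in the abstract — each node mixing quantized neighbor values with its own state, plus a local gradient step — can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the quadratic local costs, the ring topology, the mixing weights, the step sizes `eps` and `alpha`, and the grid-based unbiased stochastic quantizer are all assumptions chosen for the example.

```python
import random

def quantize(x, step=0.01):
    # Unbiased stochastic quantizer on a grid of spacing `step`:
    # rounds up with probability proportional to the remainder,
    # so that E[Q(x)] = x (a hypothetical choice of quantizer).
    low = step * (x // step)
    p = (x - low) / step
    return low + step if random.random() < p else low

def qdgd(targets, weights, eps=0.05, alpha=0.05, iters=5000, seed=0):
    # Local costs f_i(x) = 0.5 * (x - targets[i])^2, so grad f_i(x) = x - targets[i].
    # Each node broadcasts a quantized copy of its variable, then combines
    # the quantized neighbor values with its own (unquantized) state and
    # takes a scaled local gradient step.
    random.seed(seed)
    n = len(targets)
    x = [0.0] * n
    for _ in range(iters):
        q = [quantize(xi) for xi in x]  # quantized broadcasts
        new_x = []
        for i in range(n):
            mix = sum(weights[i][j] * q[j] for j in range(n) if j != i)
            grad = x[i] - targets[i]
            new_x.append((1 - eps + eps * weights[i][i]) * x[i]
                         + eps * mix
                         - eps * alpha * grad)
        x = new_x
    return x
```

With a doubly stochastic weight matrix and small step sizes, the nodes settle near the minimizer of the aggregate cost (the average of `targets`), up to a small residual error driven by the fixed step size and the quantization noise.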