Authors: Zhengtao Ding, Bo Liu
DOI: 10.1016/J.NEUCOM.2021.01.020
Keywords:
Abstract: Facing the challenge of distributed computing for processing large-scale data, this paper proposes a consensus-based decentralized training method with communication compression. First, the training algorithm is designed based on the network topology to reduce the communication burden of the busiest agent and to avoid any agent revealing its locally stored data. The convergence of the algorithm is then analyzed, which demonstrates that the trained model can reach the minimal empirical risk over the whole dataset without sharing data samples. Furthermore, communication compression combined with an error-compensated method is considered to reduce communication costs during the training process. Finally, a simulation study shows that the proposed method is applicable to both IID and non-IID datasets and exhibits much better performance than local training. Besides, with an appropriate compression rate, it achieves performance comparable to centralized training while saving a large amount of communication cost.
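The sketch below is a minimal illustration (not the authors' implementation) of the two ideas the abstract combines: consensus averaging of locally updated models over a fixed communication topology, and error-compensated top-k compression of the exchanged updates. The ring topology, the top-k compressor, the learning rate, and the toy quadratic objective are all illustrative assumptions.

import numpy as np

def top_k_compress(vec, k):
    # Keep the k largest-magnitude entries of vec; zero out the rest.
    out = np.zeros_like(vec)
    idx = np.argpartition(np.abs(vec), -k)[-k:]
    out[idx] = vec[idx]
    return out

def decentralized_step(params, grads, errors, neighbors, lr=0.1, k=2):
    # One training round: local SGD step with error-compensated compression,
    # followed by consensus averaging with neighbors over a fixed topology.
    n = len(params)
    messages = []
    for i in range(n):
        # Error compensation: add the residual left over from the previous
        # round before compressing, so no update information is lost forever.
        corrected = grads[i] + errors[i]
        compressed = top_k_compress(corrected, k)
        errors[i] = corrected - compressed              # carry residual forward
        messages.append(params[i] - lr * compressed)    # compressed local update to share
    new_params = []
    for i in range(n):
        # Consensus averaging: mix the local update with neighbors' updates;
        # only compressed model updates are exchanged, never raw data samples.
        group = [messages[i]] + [messages[j] for j in neighbors[i]]
        new_params.append(np.mean(group, axis=0))
    return new_params, errors

# Toy usage: 4 agents on a ring, each holding a 10-dimensional parameter vector.
rng = np.random.default_rng(0)
params = [rng.normal(size=10) for _ in range(4)]
errors = [np.zeros(10) for _ in range(4)]
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
for _ in range(5):
    grads = [p.copy() for p in params]  # gradient of 0.5*||p||^2 as a stand-in objective
    params, errors = decentralized_step(params, grads, errors, ring)

Under these assumptions, each agent only transmits a sparsified update to its ring neighbors, which mirrors the abstract's points about reducing the busiest agent's communication burden and never sharing local data samples.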