Authors: Ramtin Pedarsani, Ali Jadbabaie, Aryan Mokhtari, Amirhossein Reisizadeh, Hamed Hassani
DOI:
Keywords:
Abstract: Federated learning is a distributed framework according to which a model is trained over a set of devices, while keeping data localized. This framework faces several systems-oriented challenges, which include (i) a communication bottleneck, since a large number of devices upload their local updates to a parameter server, and (ii) scalability, as the federated network consists of millions of devices. Due to these systems challenges, as well as issues related to statistical heterogeneity and privacy concerns, designing a provably efficient federated learning method is of significant importance yet remains challenging. In this paper, we present FedPAQ, a communication-efficient Federated Learning method with Periodic Averaging and Quantization. FedPAQ relies on three key features: (1) periodic averaging, where models are updated locally at the devices and only periodically averaged at the server; (2) partial device participation, where only a fraction of the devices participate in each round of training; and (3) quantized message-passing, where the edge nodes quantize their updates before uploading them to the server. These features address the communication and scalability challenges in federated learning. We also show that FedPAQ achieves near-optimal theoretical guarantees for strongly convex and non-convex loss functions, and we empirically demonstrate the communication-computation tradeoff provided by our method.
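The following is a minimal sketch, in Python with NumPy, of how the three features described in the abstract (periodic averaging, partial device participation, and quantized message-passing) fit together in one communication round. All names and parameters here (quantize, local_sgd, fedpaq_round, tau, frac, the number of quantization levels, and the toy quadratic losses) are illustrative assumptions for exposition, not the authors' reference implementation.

```python
# Sketch of the three features described in the abstract; all names and
# hyperparameters are illustrative assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def quantize(v, levels=8):
    """Unbiased stochastic quantizer: scale by the norm and randomly round
    each coordinate to one of `levels` uniform levels."""
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    scaled = np.abs(v) / norm * levels
    lower = np.floor(scaled)
    q = lower + (rng.random(v.shape) < (scaled - lower))
    return np.sign(v) * q * norm / levels

def local_sgd(x, grad_fn, tau=5, lr=0.1):
    """Run tau local gradient steps starting from the server model x
    (periodic averaging: the server only sees the result every tau steps)."""
    for _ in range(tau):
        x = x - lr * grad_fn(x)
    return x

def fedpaq_round(x_server, device_grads, frac=0.2, tau=5, lr=0.1):
    """One communication round: sample a fraction of devices, run local
    updates, quantize each model difference, and average at the server."""
    n = len(device_grads)
    sampled = rng.choice(n, size=max(1, int(frac * n)), replace=False)
    deltas = []
    for i in sampled:
        x_local = local_sgd(x_server.copy(), device_grads[i], tau, lr)
        deltas.append(quantize(x_local - x_server))  # compress before upload
    return x_server + np.mean(deltas, axis=0)        # server-side averaging

# Toy usage: each device holds a quadratic loss 0.5 * ||x - t_i||^2 with a
# different optimum t_i, mimicking heterogeneous local data.
dim, n_devices = 10, 50
targets = [rng.normal(size=dim) for _ in range(n_devices)]
grads = [lambda x, t=t: x - t for t in targets]
x = np.zeros(dim)
for _ in range(100):
    x = fedpaq_round(x, grads)
```

In this sketch, increasing tau reduces how often devices communicate (more local computation per round), while the quantizer and the sampled fraction shrink the size and number of uplink messages per round, which is the communication-computation tradeoff the abstract refers to.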