Authors: H. Vincent Poor, Gauri Joshi, Jianyu Wang, Qinghua Liu, Hao Liang
DOI:
Keywords:
Abstract: In federated optimization, heterogeneity in the clients' local datasets and computation speeds results in large variations in the number of local updates performed by each client in each communication round. Naive weighted aggregation of such models causes objective inconsistency, that is, the global model converges to a stationary point of a mismatched objective function which can be arbitrarily different from the true objective. This paper provides a general framework to analyze the convergence of federated heterogeneous optimization algorithms. It subsumes previously proposed methods such as FedAvg and FedProx, and provides the first principled understanding of the solution bias and the convergence slowdown due to objective inconsistency. Using insights from this analysis, we propose FedNova, a normalized averaging method that eliminates objective inconsistency while preserving fast error convergence.
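The contrast between naive weighted aggregation and normalized averaging can be sketched in a few lines. The snippet below is an illustrative simplification, not the paper's implementation: it assumes vanilla local SGD, so each client's cumulative update is normalized by its local step count `tau_i` before aggregation, and the aggregate is rescaled by an effective step count; the function names and the scalar `tau_eff` choice are our own.

```python
import numpy as np

def naive_aggregate(client_ws, weights):
    # Naive weighted averaging of client models: clients that ran more
    # local steps pull the average further, causing objective inconsistency.
    return sum(p * w for p, w in zip(weights, client_ws))

def fednova_aggregate(global_w, client_ws, weights, local_steps):
    # Normalized averaging in the spirit of FedNova (sketch):
    # 1) normalize each client's cumulative update by its step count tau_i,
    # 2) average the normalized directions with the data weights p_i,
    # 3) rescale by an effective number of steps tau_eff.
    deltas = [(global_w - w) / tau for w, tau in zip(client_ws, local_steps)]
    tau_eff = sum(p * tau for p, tau in zip(weights, local_steps))
    avg_dir = sum(p * d for p, d in zip(weights, deltas))
    return global_w - tau_eff * avg_dir
```

When all clients perform the same number of local steps, the normalization cancels and both rules coincide; they diverge exactly when the step counts are heterogeneous, which is the regime the paper targets.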