Authors: Martin Takác, Peter Richtárik, Adil Salim, Dmitry Kovalev, Ahmed Khaled
DOI:
Keywords:
Abstract: We propose basic and natural assumptions under which iterative optimization methods with compressed iterates can be analyzed. This problem is motivated by the practice of federated learning, where a large model stored in the cloud is compressed before it is sent to a mobile device, which then continues training based on local data. We develop standard and variance-reduced methods, and establish communication complexity bounds. Our algorithms are the first distributed methods with compressed iterates, and the first fixed-point methods with compressed iterates.
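To make the core idea concrete, here is a minimal sketch (not the paper's algorithm) of gradient descent where the *iterate* itself is compressed before each step, using an unbiased rand-k sparsification operator. All function names and parameters here are illustrative assumptions, and the quadratic objective is chosen only for demonstration.

```python
import random

def compress(x, k):
    # Rand-k sparsification: keep k random coordinates,
    # rescaled by d/k so the compression is unbiased in expectation.
    d = len(x)
    idx = set(random.sample(range(d), k))
    return [xi * d / k if i in idx else 0.0 for i, xi in enumerate(x)]

def gd_compressed_iterates(grad, x0, lr, k, steps, seed=0):
    # Gradient descent with compressed iterates: the model (iterate) is
    # compressed before being used for the local gradient step, mimicking
    # a server sending a compressed model to a device.
    random.seed(seed)
    x = list(x0)
    for _ in range(steps):
        cx = compress(x, k)                          # compressed model sent to device
        g = grad(cx)                                 # local gradient at the compressed iterate
        x = [xi - lr * gi for xi, gi in zip(x, g)]   # update full iterate
    return x

# Illustrative example: minimize f(x) = ||x - b||^2 / 2, whose gradient is x - b.
b = [1.0, 2.0, 3.0, 4.0]
grad = lambda x: [xi - bi for xi, bi in zip(x, b)]
x = gd_compressed_iterates(grad, [0.0] * 4, lr=0.1, k=3, steps=500)
```

Because the compression is unbiased but noisy, a plain method like this only converges to a neighborhood of the solution; controlling (or removing) this neighborhood via variance reduction is what motivates the methods the abstract refers to.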