FedPAQ: A Communication-Efficient Federated Learning Method with Periodic Averaging and Quantization
Federated learning is a new distributed machine learning approach, where a model is trained over a set of devices such as mobile phones, while keeping data localized. Federated learning faces several systems challenges, including (i) a communication bottleneck due to a large number of devices uploading their local updates to a parameter server, and (ii) scalability, as the federated network consists of millions of devices, only some of which are active at any given time. As a result of these system challenges, alongside additional challenges such as statistical heterogeneity of data and privacy concerns, designing a provably efficient federated learning method is of significant importance and difficulty. In this paper, we present FedPAQ, a communication-efficient Federated Learning method with Periodic Averaging and Quantization. Our method rests on three key features: (1) quantized message-passing, where the edge nodes quantize their updates before uploading to the parameter server; (2) periodic averaging, where models are updated locally at devices and only periodically averaged at the server; and (3) partial device participation, where only a fraction of devices participate in each round of training. These features address the communication and scalability challenges in federated learning. FedPAQ is provably near-optimal in the following sense. Under the problem setup of expected risk minimization with independent and identically distributed data points, when the loss function is strongly convex the proposed method converges to the optimal solution with near-optimal rate, and when the loss function is non-convex it finds a first-order stationary point with near-optimal rate.
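The sketch below illustrates how the three features described above fit together in a single training round: the server samples a subset of devices, each sampled device runs several local SGD steps, quantizes the resulting model difference, and the server averages the quantized updates. This is a minimal illustration, not the authors' implementation; the helper names (`quantize`, `local_sgd`, `fedpaq_round`), the QSGD-style stochastic quantizer, and the gradient oracles are assumptions made for the example.

```python
import numpy as np

# Illustrative sketch of one FedPAQ-style round (not the authors' code).
# `quantize`, `local_sgd`, and `fedpaq_round` are hypothetical helpers;
# the quantizer shown is a simple unbiased stochastic-rounding scheme
# chosen for illustration.

def quantize(v, levels=16):
    """Unbiased stochastic quantizer: round |v|/||v|| to one of `levels` levels."""
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    scaled = np.abs(v) / norm * levels
    lower = np.floor(scaled)
    prob = scaled - lower                      # probability of rounding up
    rounded = lower + (np.random.rand(*v.shape) < prob)
    return norm * np.sign(v) * rounded / levels

def local_sgd(x, grad_fn, steps, lr):
    """Run `steps` local SGD iterations starting from the global model x."""
    x = x.copy()
    for _ in range(steps):
        x -= lr * grad_fn(x)
    return x

def fedpaq_round(x_global, device_grads, num_sampled, local_steps, lr):
    """One round: sample devices, local SGD, quantize differences, average."""
    sampled = np.random.choice(len(device_grads), num_sampled, replace=False)
    update = np.zeros_like(x_global)
    for i in sampled:
        x_local = local_sgd(x_global, device_grads[i], local_steps, lr)
        update += quantize(x_local - x_global)  # quantized message to server
    return x_global + update / num_sampled      # periodic averaging at server
```

In this sketch, communication savings come from two sources: each sampled device transmits only a low-precision version of its model difference, and it does so only once per round rather than after every local step.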