FedADMM: A Federated Primal-Dual Algorithm Allowing Partial Participation

03/28/2022
by Han Wang, et al.

Federated learning is a framework for distributed optimization that places emphasis on communication efficiency. It follows a client-server broadcast model and is particularly appealing because of its ability to accommodate heterogeneity in client compute and storage resources, non-i.i.d. data, and data privacy requirements. Our contribution is a new federated learning algorithm, FedADMM, for solving non-convex composite optimization problems with non-smooth regularizers. We prove convergence of FedADMM in the case where not all clients are able to participate in a given communication round, under a very general sampling model.
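To make the setting concrete, the sketch below shows a generic consensus-ADMM federated loop with partial client participation on a toy convex composite problem (least squares plus an l1 regularizer). It is a minimal illustration under assumed choices, not the authors' exact FedADMM updates: the toy losses, the sampling fraction, the penalty `rho`, and the update ordering are all illustrative assumptions.

```python
import numpy as np

# Hypothetical toy problem: client i holds (A_i, b_i); the global objective is
#   min_x  (1/N) * sum_i ||A_i x - b_i||^2  +  lam * ||x||_1,
# a composite objective with a non-smooth regularizer (here convex, for simplicity).
rng = np.random.default_rng(0)
N, d, rho, lam = 10, 5, 1.0, 0.1          # illustrative sizes and parameters
clients = [(rng.standard_normal((20, d)), rng.standard_normal(20)) for _ in range(N)]

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1, handling the non-smooth term."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x_loc = [np.zeros(d) for _ in range(N)]   # local primal variables
y_loc = [np.zeros(d) for _ in range(N)]   # local dual variables
z = np.zeros(d)                           # server (global) variable

for rnd in range(50):
    # Partial participation: the server samples a subset of clients each round;
    # inactive clients simply keep their previous primal/dual states.
    active = rng.choice(N, size=N // 2, replace=False)
    for i in active:
        A, b = clients[i]
        # Local primal step: argmin_x f_i(x) + y_i^T (x - z) + (rho/2) ||x - z||^2
        # (closed form here because the toy f_i is quadratic).
        x_loc[i] = np.linalg.solve(
            (2.0 / N) * A.T @ A + rho * np.eye(d),
            (2.0 / N) * A.T @ b + rho * z - y_loc[i],
        )
        # Local dual ascent step against the broadcast z.
        y_loc[i] = y_loc[i] + rho * (x_loc[i] - z)
    # Server step: aggregate client states, then apply the prox of the regularizer.
    avg = np.mean([x_loc[i] + y_loc[i] / rho for i in range(N)], axis=0)
    z = soft_threshold(avg, lam / (N * rho))

print("global iterate:", np.round(z, 3))
```

The client-then-server update ordering and the prox-based server aggregation follow the standard consensus-ADMM template; the paper's analysis concerns the non-convex case and the general client-sampling model, which this toy example does not attempt to reproduce.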
