Iterated Vector Fields and Conservatism, with Applications to Federated Learning

09/08/2021
by Zachary Charles, et al.

We study iterated vector fields and investigate whether they are conservative, in the sense that they are the gradient of some scalar-valued function. We analyze the conservatism of various iterated vector fields, including gradient vector fields associated to loss functions of generalized linear models. We relate this study to optimization and derive novel convergence results for federated learning algorithms. In particular, we show that for certain classes of functions (including non-convex functions), federated averaging is equivalent to gradient descent on a surrogate loss function. Finally, we discuss a variety of open questions spanning topics in geometry, dynamical systems, and optimization.
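The equivalence between federated averaging and gradient descent on a surrogate loss can be illustrated in the simplest setting: one-dimensional quadratic client losses. The sketch below is an illustration of that claim, not code from the paper; the names (`a`, `b`, `eta`, `k`) and the specific surrogate coefficients are assumptions chosen so the two updates coincide for quadratics.

```python
# Illustration (assumed setup, not from the paper): clients with quadratic
# losses f_i(x) = (a_i/2) * (x - b_i)^2. Each round of federated averaging
# runs k local gradient steps per client and averages the results. For
# quadratics this equals one gradient step (server rate 1) on a surrogate
# loss mean_i (c_i/2)(x - b_i)^2 with c_i = 1 - (1 - eta*a_i)^k.

a = [1.0, 2.0, 0.5]   # client curvatures (illustrative values)
b = [3.0, -1.0, 4.0]  # client minimizers
eta, k = 0.1, 5       # local learning rate and number of local steps

def fedavg_step(x):
    # Run k local gradient steps on each client, then average the iterates.
    outs = []
    for ai, bi in zip(a, b):
        xi = x
        for _ in range(k):
            xi = xi - eta * ai * (xi - bi)
        outs.append(xi)
    return sum(outs) / len(outs)

def surrogate_gd_step(x):
    # One gradient step (rate 1) on the surrogate quadratic loss.
    c = [1 - (1 - eta * ai) ** k for ai in a]
    grad = sum(ci * (x - bi) for ci, bi in zip(c, b)) / len(c)
    return x - grad

x_fed = x_gd = 10.0
for _ in range(20):
    x_fed = fedavg_step(x_fed)
    x_gd = surrogate_gd_step(x_gd)
    assert abs(x_fed - x_gd) < 1e-8  # the two trajectories coincide
```

Here the averaged "iterated vector field" of the local update maps is conservative, which is why a single scalar surrogate loss can reproduce the federated averaging dynamics exactly.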
