Distributed stochastic optimization with gradient tracking over strongly-connected networks

03/18/2019
by Ran Xin, et al.

In this paper, we study distributed stochastic optimization to minimize a sum of smooth, strongly-convex local cost functions over a network of agents that communicate over a strongly-connected directed graph. Assuming that each agent has access to a stochastic first-order oracle (SFO), we propose a novel distributed method, called S-AB, in which each agent maintains an auxiliary variable that asymptotically tracks the gradient of the global cost in expectation. The S-AB algorithm employs row- and column-stochastic weights simultaneously to ensure both consensus and optimality. Since doubly-stochastic weights are not required, S-AB is applicable to arbitrary strongly-connected graphs. We show that, for a sufficiently small constant step-size, S-AB converges linearly (in the expected mean-square sense) to a neighborhood of the global minimizer. We present numerical simulations based on real-world data sets to illustrate the theoretical results.
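To make the weight structure concrete, here is a minimal NumPy sketch of an AB-style gradient-tracking iteration of the kind the abstract describes: estimates are mixed with a row-stochastic matrix A, gradient trackers with a column-stochastic matrix B, and every gradient evaluation comes from a noisy first-order oracle. The directed-ring topology, quadratic local costs, noise level, and step-size below are illustrative assumptions, not the paper's setup; the exact S-AB recursion and its constants should be taken from the paper itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n agents on a directed ring (strongly connected),
# each with a local quadratic cost f_i(x) = 0.5 * q_i * (x - t_i)^2.
n = 8
q = rng.uniform(1.0, 2.0, size=n)       # local curvatures
t = rng.normal(size=n)                  # local targets
x_star = np.sum(q * t) / np.sum(q)      # minimizer of sum_i f_i

def sfo(i, x):
    """Stochastic first-order oracle: exact local gradient plus noise."""
    return q[i] * (x - t[i]) + 0.01 * rng.normal()

# Row-stochastic A and column-stochastic B on the same directed ring:
# each agent combines its own value with its in-neighbor's.
A = np.zeros((n, n))
B = np.zeros((n, n))
for i in range(n):
    A[i, i] = A[i, (i - 1) % n] = 0.5   # rows of A sum to 1
    B[i, i] = B[(i + 1) % n, i] = 0.5   # columns of B sum to 1

alpha = 0.05                            # small constant step-size
x = rng.normal(size=n)                  # local estimates of the minimizer
g = np.array([sfo(i, x[i]) for i in range(n)])
y = g.copy()                            # gradient trackers, y_0 = g_0

for k in range(2000):
    x = A @ x - alpha * y               # consensus step plus descent along tracker
    g_new = np.array([sfo(i, x[i]) for i in range(n)])
    y = B @ y + g_new - g               # tracker: add the change in local gradients
    g = g_new

print("mean estimate:", x.mean(), " global minimizer:", x_star)
```

With persistent oracle noise and a constant step-size, the agents do not converge exactly; consistent with the stated result, they settle in a neighborhood of the global minimizer whose size shrinks with the step-size.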

