Asynchronous decentralized accelerated stochastic gradient descent

09/24/2018
by Guanghui Lan, et al.

In this work, we introduce an asynchronous decentralized accelerated stochastic gradient descent type of method for decentralized stochastic optimization, motivated by the fact that communication and synchronization are the major bottlenecks in such settings. We establish an O(1/ϵ) (resp., O(1/√ϵ)) communication complexity and an O(1/ϵ²) (resp., O(1/ϵ)) sampling complexity for solving general convex (resp., strongly convex) problems.
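For intuition about how decentralized stochastic gradient methods interleave communication (gossip averaging with neighbors) and local stochastic gradient steps, below is a minimal synchronous toy sketch on a least-squares problem. It is not the authors' asynchronous accelerated method; the ring topology, mixing matrix W, step size, and problem data are all illustrative assumptions.

```python
import numpy as np

# Toy sketch of synchronous decentralized SGD (illustrative only; the
# paper's method is asynchronous and accelerated, which this loop does
# not capture). All sizes and parameters below are assumptions.

rng = np.random.default_rng(0)
n_agents, n_samples, dim, n_steps = 8, 20, 5, 2000
step = 0.05

# Each agent i holds a private least-squares objective
# f_i(x) = (1/2) * sum_j (a_ij^T x - b_ij)^2 over its own samples.
A = rng.normal(size=(n_agents, n_samples, dim))
x_star = rng.normal(size=dim)
b = A @ x_star + 0.01 * rng.normal(size=(n_agents, n_samples))

# Doubly stochastic mixing matrix for a ring topology:
# each agent averages with its two immediate neighbors.
W = np.zeros((n_agents, n_agents))
for i in range(n_agents):
    W[i, i] = 0.5
    W[i, (i - 1) % n_agents] = 0.25
    W[i, (i + 1) % n_agents] = 0.25

x = np.zeros((n_agents, dim))  # local iterates, one row per agent
for t in range(n_steps):
    # Stochastic gradient: each agent samples one of its own data points.
    idx = rng.integers(0, n_samples, size=n_agents)
    grads = np.stack([
        A[i, idx[i]] * (A[i, idx[i]] @ x[i] - b[i, idx[i]])
        for i in range(n_agents)
    ])
    # Gossip (communication) step followed by a local gradient step.
    x = W @ x - step * grads

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to x*:", np.linalg.norm(x.mean(axis=0) - x_star))
```

In this sketch, the gossip step `W @ x` is the communication round whose count the O(1/ϵ) bound would control, while each stochastic gradient evaluation counts toward the sampling complexity; the abstract's point is that these two costs can be bounded separately.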
