Global Convergence of Stochastic Gradient Descent for Some Non-convex Matrix Problems

11/05/2014
by Christopher De Sa, et al.

Stochastic gradient descent (SGD) on a low-rank factorization is commonly employed to speed up matrix problems including matrix completion, subspace tracking, and SDP relaxation. In this paper, we exhibit a step size scheme for SGD on a low-rank least-squares problem, and we prove that, under broad sampling conditions, our method converges globally from a random starting point within O(ϵ^{-1} n log n) steps with constant probability for constant-rank problems. Our modification of SGD relates it to stochastic power iteration. We also present experiments illustrating the runtime and convergence of the algorithm.
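To make the connection between SGD on a low-rank factorization and stochastic power iteration concrete, here is a minimal sketch for the rank-1 case: recovering the factor of a PSD matrix A = x xᵀ from randomly sampled rows via an Oja-style update y ← normalize(y + η Â y). The one-row estimator Â and the constant step size η below are illustrative assumptions, not the step size scheme or sampling model analyzed in the paper.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact algorithm): recover the
# factor of a rank-1 PSD matrix A = x x^T from randomly sampled rows with
# an Oja-style stochastic update resembling stochastic power iteration.

rng = np.random.default_rng(0)
n = 100
x = rng.standard_normal(n)
A = np.outer(x, x)                 # ground-truth rank-1 PSD matrix

y = rng.standard_normal(n)         # random starting point
y /= np.linalg.norm(y)
eta = 1e-3                         # assumed constant step size, for illustration

for _ in range(50_000):
    i = rng.integers(n)
    # Unbiased one-row estimator of A: A_hat = n * e_i A[i, :], so
    # A_hat @ y only touches coordinate i of the iterate.
    y[i] += eta * n * (A[i, :] @ y)
    y /= np.linalg.norm(y)         # renormalize, as in power iteration

# Sign-invariant alignment between the iterate and the true factor
alignment = abs(y @ x) / np.linalg.norm(x)
print(f"alignment with true factor: {alignment:.3f}")
```

With lower-variance estimators (e.g., sampling more entries per step) the same update tracks the deterministic power iteration more closely; the paper's contribution is proving global convergence from a random start under a specific step size scheme and broad sampling conditions.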
