Stochastic Cubic Regularization for Fast Nonconvex Optimization

11/08/2017
by Nilesh Tripuraneni, et al.

This paper proposes a stochastic variant of a classic algorithm, the cubic-regularized Newton method of Nesterov and Polyak (2006). The proposed algorithm efficiently escapes saddle points and finds approximate local minima of general smooth, nonconvex functions using only Õ(ϵ^-3.5) stochastic gradient and stochastic Hessian-vector product evaluations, where the latter can be computed as efficiently as stochastic gradients. This improves upon the Õ(ϵ^-4) rate of stochastic gradient descent and matches the best-known rate for finding local minima, without requiring any delicate acceleration or variance-reduction techniques.
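
As a rough illustration of the idea summarized above, the sketch below shows one stochastic cubic-regularized Newton step in Python/NumPy: a mini-batch gradient and a mini-batch Hessian-vector-product oracle are formed, and the cubic model is minimized approximately. The oracle functions (`grad_i`, `hvp_i`), the batch sizes, and the plain gradient-descent subproblem solver are illustrative assumptions, not the paper's exact prescription.

```python
import numpy as np

def stochastic_cubic_step(x, grad_i, hvp_i, n, rho, batch_g=256, batch_h=64,
                          inner_steps=50, inner_lr=0.01, rng=None):
    """One illustrative step: approximately minimize the stochastic cubic model
        m(s) = g^T s + 0.5 * s^T H s + (rho / 6) * ||s||^3,
    where g and H·v are estimated from random mini-batches of the n components.
    grad_i(x, i) and hvp_i(x, i, v) are user-supplied per-sample oracles."""
    rng = np.random.default_rng() if rng is None else rng

    # Subsampled stochastic gradient g ≈ ∇f(x).
    idx_g = rng.choice(n, size=batch_g, replace=False)
    g = np.mean([grad_i(x, i) for i in idx_g], axis=0)

    # Subsampled Hessian-vector product oracle v ↦ H v ≈ ∇²f(x) v.
    idx_h = rng.choice(n, size=batch_h, replace=False)
    def Hv(v):
        return np.mean([hvp_i(x, i, v) for i in idx_h], axis=0)

    # Approximately minimize the cubic model by gradient descent on s;
    # ∇m(s) = g + H s + (rho / 2) * ||s|| * s.
    s = np.zeros_like(x)
    for _ in range(inner_steps):
        model_grad = g + Hv(s) + 0.5 * rho * np.linalg.norm(s) * s
        s -= inner_lr * model_grad

    return x + s
```

In practice one would iterate such steps until the model decrease (or the norm of the step) falls below a threshold tied to the target accuracy ϵ; the subproblem can also be solved with more specialized routines than plain gradient descent.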
