Stochastic Variance-Reduced Cubic Regularized Newton Method

02/13/2018
by Dongruo Zhou, et al.

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient along with a semi-stochastic Hessian, both specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an (ϵ, √ϵ)-approximate local minimum within Õ(n^(4/5)/ϵ^(3/2)) second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order non-convex optimization methods. Thorough experiments on various non-convex optimization problems support our theory.
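At a high level, the approach maintains a snapshot point where the full gradient and Hessian are computed once per epoch, corrects cheap minibatch estimates against that snapshot, and plugs the corrected estimates into a cubic-regularized Newton step. The sketch below illustrates that structure in Python under simplifying assumptions: it uses plain SVRG-style difference estimators (not necessarily the authors' exact semi-stochastic constructions) and a crude gradient-descent solver for the cubic subproblem; the names svrc_sketch, grad_i, and hess_i and all hyperparameter defaults are hypothetical.

```python
import numpy as np

def svrc_sketch(grad_i, hess_i, x0, n, M=10.0, epochs=5, inner=10,
                bg=32, bh=32, sub_steps=50, sub_lr=0.01, seed=0):
    """Illustrative SVRC-style loop (a sketch, not the paper's exact method).

    grad_i(x, i) -> gradient of the i-th component function f_i at x
    hess_i(x, i) -> Hessian of the i-th component function f_i at x
    n            -> number of component functions in the finite sum
    M            -> cubic regularization parameter
    """
    rng = np.random.default_rng(seed)
    d = x0.size
    x = x0.copy()
    for _ in range(epochs):
        # Snapshot: full gradient and Hessian, computed once per epoch.
        x_tilde = x.copy()
        g_full = np.mean([grad_i(x_tilde, i) for i in range(n)], axis=0)
        H_full = np.mean([hess_i(x_tilde, i) for i in range(n)], axis=0)
        for _ in range(inner):
            # Semi-stochastic gradient: minibatch difference + snapshot gradient.
            Ig = rng.integers(0, n, bg)
            v = g_full + np.mean(
                [grad_i(x, i) - grad_i(x_tilde, i) for i in Ig], axis=0)
            # Semi-stochastic Hessian, built the same way.
            Ih = rng.integers(0, n, bh)
            U = H_full + np.mean(
                [hess_i(x, i) - hess_i(x_tilde, i) for i in Ih], axis=0)
            # Approximately solve the cubic-regularized subproblem
            #   min_h  v^T h + 0.5 h^T U h + (M/6) ||h||^3
            # by gradient descent (a stand-in for a proper cubic solver).
            h = np.zeros(d)
            for _ in range(sub_steps):
                grad_model = v + U @ h + 0.5 * M * np.linalg.norm(h) * h
                h -= sub_lr * grad_model
            x = x + h
    return x
```

The cubic term (M/6)||h||³ is what distinguishes this from a plain Newton step: when M upper-bounds the Hessian Lipschitz constant, the cubic model majorizes the objective, so each step makes provable progress toward an (ϵ, √ϵ)-approximate local minimum; the variance-reduced estimators keep the per-iteration gradient and Hessian cost at minibatch scale.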
