Adaptive Step Sizes in Variance Reduction via Regularization
The main goal of this work is to equip convex and nonconvex problems with the Barzilai-Borwein (BB) step size. Although BB step sizes are adaptive, they can fail when the objective function is not strongly convex. To overcome this challenge, the key idea here is to bridge (non)convex problems and strongly convex ones via regularization. The proposed regularization schemes are simple yet effective. Wedding the BB step size to a variance reduction method known as SARAH offers a free lunch compared with vanilla SARAH on convex problems. Convergence of BB step sizes on nonconvex problems is also established, with a complexity no worse than that of other adaptive step sizes such as AdaGrad. As a byproduct, our regularized SARAH methods for convex functions ensure that the complexity of finding a point with E[‖∇f(x)‖²] ≤ ϵ is O((n + 1/√ϵ) ln(1/ϵ)), improving the ϵ-dependence of existing results. Numerical tests further validate the merits of the proposed approaches.
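To make the ingredients concrete, below is a minimal NumPy sketch of a SARAH-style outer/inner loop whose step size is recomputed each epoch by a BB rule, applied to a toy ℓ2-regularized least-squares problem. The problem instance, the BB1 variant with the 1/m scaling popularized by SVRG-BB, the inner-loop length, and all names here are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: SARAH with a Barzilai-Borwein (BB) step size on regularized
# least squares  f(x) = (1/2n)||Ax - b||^2 + (lam/2)||x||^2.
# All constants and the BB variant below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
lam = 1e-3  # quadratic regularizer: the "bridge" to strong convexity

def full_grad(x):
    return A.T @ (A @ x - b) / n + lam * x

def comp_grad(x, i):
    # Gradient of the i-th component function plus the regularizer.
    return A[i] * (A[i] @ x - b[i]) + lam * x

x_prev = np.zeros(d)
g_prev = full_grad(x_prev)
x = 0.1 * rng.standard_normal(d)
m = n  # inner-loop length (a common choice)

for epoch in range(20):
    g = full_grad(x)
    # BB1 rule, scaled by 1/m for stability as in SVRG-BB:
    # eta = ||s||^2 / (m |s^T y|), s = x_k - x_{k-1}, y = g_k - g_{k-1}.
    s, y = x - x_prev, g - g_prev
    sy = abs(s @ y)
    eta = (s @ s) / (m * sy) if sy > 1e-12 else 1e-3
    x_prev, g_prev = x, g

    # SARAH recursive estimator: v_t = grad_i(w_t) - grad_i(w_{t-1}) + v_{t-1}.
    v = g
    w_prev, w = x, x - eta * g
    for _ in range(m):
        i = rng.integers(n)
        v = comp_grad(w, i) - comp_grad(w_prev, i) + v
        w_prev, w = w, w - eta * v
    x = w  # outer iterate := last inner iterate (one common convention)

print("final ||grad||:", np.linalg.norm(full_grad(x)))
```

Note the role of the regularizer: for a lam-strongly-convex objective, sᵀy ≥ lam‖s‖² > 0, so the BB quotient stays well defined, which is exactly the failure mode the abstract describes for objectives that are not strongly convex.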