Adaptive Accelerated (Extra-)Gradient Methods with Variance Reduction

01/28/2022
by Zijian Liu, et al.

In this paper, we study the finite-sum convex optimization problem, focusing on the general convex case. Recently, the study of variance reduced (VR) methods and their accelerated variants has made exciting progress. However, the step size used in existing VR algorithms typically depends on the smoothness parameter, which is often unknown and requires tuning in practice. To address this problem, we propose two novel adaptive VR algorithms: Adaptive Variance Reduced Accelerated Extra-Gradient (AdaVRAE) and Adaptive Variance Reduced Accelerated Gradient (AdaVRAG). Our algorithms do not require knowledge of the smoothness parameter. AdaVRAE uses O(n log log n + √(nβ/ε)) gradient evaluations and AdaVRAG uses O(n log log n + √(nβ log β / ε)) gradient evaluations to attain an O(ε)-suboptimal solution, where n is the number of functions in the finite sum and β is the smoothness parameter. This result matches the best-known convergence rate of non-adaptive VR methods and improves upon the convergence rate of the state-of-the-art adaptive VR method, AdaSVRG. We demonstrate the superior performance of our algorithms compared with previous methods in experiments on real-world datasets.
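To make the setting concrete, the sketch below shows a generic SVRG-style variance-reduced gradient step for a finite-sum objective, combined with a simple AdaGrad-style step size that avoids specifying the smoothness constant up front. This is only an illustrative sketch of the problem class and the idea of an adaptive step size; it is not the paper's AdaVRAE or AdaVRAG algorithm, and the quadratic objective, variable names, and step-size rule are assumptions made for the example.

```python
# Illustrative sketch (not AdaVRAE/AdaVRAG): SVRG-style variance-reduced
# gradients with an AdaGrad-style adaptive step size on a least-squares
# finite sum. All problem data and the step-size rule are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10                       # n component functions, dimension d
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

def grad_i(x, i):
    """Gradient of the i-th component f_i(x) = 0.5 * (a_i^T x - b_i)^2."""
    return A[i] * (A[i] @ x - b[i])

def full_grad(x):
    """Full gradient (1/n) * sum_i grad f_i(x)."""
    return A.T @ (A @ x - b) / n

x = np.zeros(d)
accum = 1e-8                         # accumulator for the adaptive step size
for epoch in range(20):
    snapshot = x.copy()              # reference point for variance reduction
    mu = full_grad(snapshot)         # full gradient at the snapshot
    for _ in range(n):
        i = rng.integers(n)
        # SVRG variance-reduced gradient estimator
        g = grad_i(x, i) - grad_i(snapshot, i) + mu
        # Adaptive step size: no smoothness constant is supplied
        accum += g @ g
        x -= g / np.sqrt(accum)

print("final objective:", 0.5 * np.mean((A @ x - b) ** 2))
```

The point of the sketch is the step-size line: rather than setting the step size to 1/β (which requires knowing the smoothness parameter β), it shrinks automatically as gradient information accumulates, which is the tuning-free behavior the abstract highlights.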
