On the Complexity of Minimizing Convex Finite Sums Without Using the Indices of the Individual Functions

02/09/2020
by Yossi Arjevani et al.

Recent advances in randomized incremental methods for minimizing L-smooth, μ-strongly convex finite sums have culminated in tight complexity bounds of Õ((n+√(nL/μ))log(1/ϵ)) and O(n+√(nL/ϵ)) for μ>0 and μ=0, respectively, where n denotes the number of individual functions. Unlike incremental methods, stochastic methods for finite sums do not rely on explicit knowledge of which individual function is being addressed at each iteration, and as such must perform at least Ω(n^2) iterations to obtain O(1/n^2)-optimal solutions. In this work, we exploit the finite noise structure of finite sums to derive a matching O(n^2) upper bound under the global oracle model, showing that this lower bound is indeed tight. Following a similar approach, we propose a novel adaptation of SVRG which is both compatible with stochastic oracles and achieves complexity bounds of Õ((n^2+n√(L/μ))log(1/ϵ)) and O(n√(L/ϵ)) for μ>0 and μ=0, respectively. Our bounds hold w.h.p. and match in part existing lower bounds of Ω̃(n^2+√(nL/μ)log(1/ϵ)) and Ω̃(n^2+√(nL/ϵ)) for μ>0 and μ=0, respectively.
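For context, the abstract's SVRG adaptation builds on the classical variance-reduction scheme of Johnson and Zhang, in which each stochastic gradient is corrected using a periodically recomputed full gradient at an anchor point. The following is a minimal sketch of *vanilla* SVRG on a least-squares finite sum, not the paper's index-free variant; the problem instance, step size, and epoch counts are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of vanilla (index-based) SVRG on a least-squares finite sum
#   f(w) = (1/n) * sum_i (a_i . w - b_i)^2 / 2.
# This illustrates the classical scheme the paper adapts; the paper's
# index-free variant and its oracle model are not reproduced here.

rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
w_star = rng.standard_normal(d)
b = A @ w_star  # noiseless targets, so the optimum attains zero loss

def grad_i(w, i):
    """Gradient of the i-th individual function at w."""
    return (A[i] @ w - b[i]) * A[i]

def svrg(epochs=30, m=2 * n, eta=0.05):
    w_tilde = np.zeros(d)  # anchor point
    for _ in range(epochs):
        full_grad = A.T @ (A @ w_tilde - b) / n  # full gradient at the anchor
        w = w_tilde.copy()
        for _ in range(m):
            i = rng.integers(n)
            # variance-reduced stochastic gradient: the correction term
            # vanishes in expectation and shrinks as w, w_tilde -> w*
            g = grad_i(w, i) - grad_i(w_tilde, i) + full_grad
            w -= eta * g
        w_tilde = w
    return w_tilde

w_hat = svrg()
print(np.linalg.norm(w_hat - w_star))  # distance to the optimum
```

Because each epoch costs n gradients for the anchor plus m stochastic gradients, the classical method attains the Õ((n+√(nL/μ))log(1/ϵ)) rate quoted above; the paper's contribution is achieving comparable guarantees without the sampled index i being revealed to the algorithm.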
