Sequential convergence of AdaGrad algorithm for smooth convex optimization

11/24/2020
by Cheik Traoré, et al.

We prove that the iterates produced by either the scalar step-size variant or the coordinatewise variant of the AdaGrad algorithm form convergent sequences when applied to convex objective functions with Lipschitz-continuous gradients. The key insight is that such AdaGrad sequences satisfy a variable-metric quasi-Fejér monotonicity property, which allows one to prove convergence.
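The two variants in question can be sketched as follows. This is a minimal illustrative implementation, not the paper's formulation: the step-size constant `eta`, the stabilizer `eps`, the iteration count, and the test objective are all assumptions chosen for the example. The coordinatewise variant accumulates squared gradients per coordinate, while the scalar variant accumulates squared gradient norms into a single step size.

```python
import numpy as np

def adagrad(grad, x0, eta=1.0, eps=1e-8, n_iter=500, coordinatewise=True):
    """Minimal AdaGrad sketch (illustrative; not the paper's exact scheme).

    coordinatewise=True : per-coordinate step sizes (diagonal AdaGrad).
    coordinatewise=False: a single scalar step size built from
                          accumulated squared gradient norms.
    """
    x = np.asarray(x0, dtype=float)
    accum = np.zeros_like(x) if coordinatewise else 0.0
    for _ in range(n_iter):
        g = grad(x)
        if coordinatewise:
            accum = accum + g * g            # per-coordinate accumulation
        else:
            accum = accum + np.dot(g, g)     # accumulated squared norms
        x = x - eta * g / (np.sqrt(accum) + eps)
    return x

# Smooth convex test problem: f(x) = 0.5 * ||A x - b||^2
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = np.array([2.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)           # Lipschitz gradient

x_star = np.linalg.solve(A, b)               # unique minimizer
x_cw = adagrad(grad, np.zeros(2), coordinatewise=True)
x_sc = adagrad(grad, np.zeros(2), coordinatewise=False)
```

On this strongly convex quadratic both variants drive the iterates to the minimizer, consistent with the sequential convergence established in the paper for the general smooth convex case.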
