Incremental Methods for Weakly Convex Optimization

07/26/2019
by Xiao Li, et al.

We consider incremental algorithms for solving weakly convex optimization problems, a broad class of (possibly nondifferentiable) nonconvex problems. In this paper we analyze the incremental (sub)gradient method, the incremental proximal point method, and the incremental prox-linear method. We show that all three incremental algorithms converge at rate O(k^{-1/4}) in the weakly convex setting, which extends the convergence theory of incremental methods from convex optimization to the nondifferentiable nonconvex regime. When the weakly convex function additionally satisfies a regularity condition called sharpness, we show that all three incremental algorithms, run with a geometrically diminishing stepsize and an appropriate initialization, converge linearly to the optimal solution set. We conduct experiments on robust matrix sensing and robust phase retrieval to illustrate the superior convergence behavior of the three incremental methods.
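The sketch below illustrates one of the three methods, the incremental subgradient method with a geometrically diminishing stepsize, on a synthetic robust phase retrieval instance with objective f(x) = (1/m) sum_i |<a_i, x>^2 - b_i|, which is weakly convex and, for generic measurements, sharp. The function name, stepsize constants (alpha0, rho), problem sizes, and the synthetic data are illustrative assumptions and not the exact setup used in the paper.

```python
import numpy as np

def incremental_subgradient(A, b, x0, alpha0=0.005, rho=0.9, epochs=50, seed=0):
    """Incremental subgradient method with a geometrically diminishing stepsize,
    sketched for the robust phase retrieval objective
        f(x) = (1/m) * sum_i | <a_i, x>^2 - b_i |.
    All parameter values here are illustrative, not the paper's choices."""
    rng = np.random.default_rng(seed)
    m = A.shape[0]
    x = x0.copy()
    for k in range(epochs):
        alpha = alpha0 * rho ** k            # geometrically diminishing stepsize
        for i in rng.permutation(m):         # one incremental pass over the components
            r = A[i] @ x
            g = np.sign(r ** 2 - b[i]) * 2.0 * r * A[i]   # subgradient of |<a_i, x>^2 - b_i|
            x = x - alpha * g
    return x

# Small synthetic instance; sizes and constants are arbitrary illustrative choices.
rng = np.random.default_rng(1)
n, m = 10, 50
x_true = rng.standard_normal(n)
x_true /= np.linalg.norm(x_true)
A = rng.standard_normal((m, n))
b = (A @ x_true) ** 2
x0 = x_true + 0.1 * rng.standard_normal(n)   # initialization near the solution set
x_hat = incremental_subgradient(A, b, x0)
dist = min(np.linalg.norm(x_hat - x_true), np.linalg.norm(x_hat + x_true))
print("distance to the solution set (up to sign):", dist)
```

The per-component update and the stepsize schedule alpha_k = alpha0 * rho^k are the ingredients highlighted in the abstract; the incremental proximal point and prox-linear methods would replace the subgradient step with a proximal or prox-linear subproblem on each component.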
