Faster Subgradient Methods for Functions with Hölderian Growth

04/01/2017
by Patrick R. Johnstone, et al.

This manuscript derives new convergence results for several subgradient methods for minimizing nonsmooth convex functions with Hölderian growth. This growth condition holds in many applications and includes functions with quadratic growth and functions with weakly sharp minima as special cases. There are four main contributions. First, for a constant and sufficiently small stepsize, we show that the subgradient method achieves linear convergence up to a region around the optimal set whose size is of the order of the stepsize. Second, we derive nonergodic convergence rates for the subgradient method under nonsummable decaying stepsizes. Third, if appropriate problem parameters are known, we derive a possibly-summable stepsize sequence that attains a much faster convergence rate. Finally, we develop a novel "descending stairs" stepsize that achieves this faster rate and, in the special case of weakly sharp functions, obtains linear convergence. We also develop a variant of the "descending stairs" stepsize that achieves essentially the same rate without requiring knowledge of an error-bound constant that is difficult to estimate in practice.
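To make the setting concrete, below is a minimal sketch of the classical subgradient method with a nonsummable decaying stepsize, applied to the toy weakly sharp function f(x) = ‖x‖₁. The specific objective, starting point, and stepsize constant are illustrative assumptions, and this is the baseline iteration the paper analyzes, not its "descending stairs" scheme.

```python
import math

def subgradient_method(f, subgrad, x0, stepsize, iters):
    """Basic subgradient iteration x_{k+1} = x_k - alpha_k * g_k,
    tracking the best iterate seen (the method is not a descent method)."""
    x = list(x0)
    best_x, best_val = x[:], f(x)
    for k in range(iters):
        a = stepsize(k)
        g = subgrad(x)
        x = [xi - a * gi for xi, gi in zip(x, g)]
        val = f(x)
        if val < best_val:
            best_x, best_val = x[:], val
    return best_x, best_val

# Toy nonsmooth objective: f(x) = ||x||_1, weakly sharp with minimizer 0.
f = lambda x: sum(abs(xi) for xi in x)
# sign(x) componentwise is a valid subgradient (0 at the kink).
subgrad = lambda x: [(xi > 0) - (xi < 0) for xi in x]

# Nonsummable decaying stepsize alpha_k = 0.5 / sqrt(k + 1) (illustrative choice).
x_best, f_best = subgradient_method(f, subgrad, [2.0, -1.5, 0.5],
                                    lambda k: 0.5 / math.sqrt(k + 1), 500)
```

With this decaying stepsize the iterates oscillate around the minimizer with shrinking amplitude, so the best objective value approaches zero; the paper's faster stepsize schedules sharpen exactly this rate.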
