Acceleration with a Ball Optimization Oracle
Consider an oracle which takes a point x and returns the minimizer of a convex function f in an ℓ_2 ball of radius r around x. It is straightforward to show that roughly r^{-1} log(1/ϵ) calls to the oracle suffice to find an ϵ-approximate minimizer of f in an ℓ_2 unit ball. Perhaps surprisingly, this is not optimal: we design an accelerated algorithm which attains an ϵ-approximate minimizer with roughly r^{-2/3} log(1/ϵ) oracle queries, and give a matching lower bound. Further, we implement ball optimization oracles for functions with locally stable Hessians using a variant of Newton's method. The resulting algorithm applies to a number of problems of practical and theoretical import, improving upon previous results for logistic and ℓ_∞ regression and achieving guarantees comparable to the state-of-the-art for ℓ_p regression.
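To make the oracle model concrete, the following is a minimal toy sketch (not the paper's accelerated algorithm) of the straightforward scheme: repeatedly call a ball optimization oracle and move to the point it returns. For illustration we take f(x) = ‖x − x*‖², for which the oracle has a closed form (move toward x* by at most r); the names `ball_oracle`, `minimize`, and `x_star` are hypothetical. On this toy instance the naive scheme needs on the order of ‖x_0 − x*‖/r calls, consistent with the r^{-1} scaling of the unaccelerated rate.

```python
import math

# Toy objective: f(x) = ||x - x_star||^2 on R^2 (hypothetical example).
x_star = (0.6, -0.3)

def ball_oracle(x, r):
    """Minimize f over the l2 ball of radius r centered at x.

    For this quadratic, the ball-constrained minimizer is explicit:
    step from x toward x_star, by at most distance r.
    """
    dx, dy = x_star[0] - x[0], x_star[1] - x[1]
    d = math.hypot(dx, dy)
    if d <= r:
        return x_star  # the global minimizer lies inside the ball
    s = r / d          # otherwise move exactly distance r toward it
    return (x[0] + s * dx, x[1] + s * dy)

def minimize(x0, r, eps):
    """Naive scheme: iterate the ball oracle until f(x) <= eps."""
    x, calls = x0, 0
    while (x[0] - x_star[0]) ** 2 + (x[1] - x_star[1]) ** 2 > eps:
        x = ball_oracle(x, r)
        calls += 1
    return x, calls

x, calls = minimize((-1.0, 1.0), r=0.1, eps=1e-6)
```

For this toy quadratic each oracle call shrinks the distance to x* by r, so the call count is roughly the initial distance divided by r; the paper's contribution is an accelerated scheme whose dependence on r improves from r^{-1} to r^{-2/3}.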