Towards Unified Acceleration of High-Order Algorithms under Hölder Continuity and Uniform Convexity

06/03/2019
by   Chaobing Song, et al.

In this paper, through an intuitive vanilla proximal-method perspective, we derive accelerated high-order optimization algorithms for minimizing a convex function with Hölder continuous higher-order derivatives. In this general convex setting, we propose a unified acceleration algorithm whose iteration complexity matches the lower bound given by Grapiglia and Nesterov (2019). If the function is furthermore uniformly convex, we propose a general restart scheme; the resulting iteration complexity matches existing lower bounds in the most important cases. For practical implementation, we introduce a new and effective heuristic that significantly simplifies the binary search procedure required by the algorithm, making the algorithm in the general setting as efficient as in the special case treated by Grapiglia and Nesterov (2019). On large-scale classification datasets, our algorithm demonstrates clear and consistent advantages of high-order acceleration over first-order methods in terms of running time. Our formulation handles the more general composite setting, in which the objective may contain an additional, possibly non-smooth, convex term, and our analysis and proofs also apply when the high-order smoothness conditions are stated with respect to non-Euclidean norms.
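For readers unfamiliar with the two regularity assumptions named above, the standard definitions from the high-order optimization literature are sketched below (the constants H_ν, σ and the degrees p, ν, q are generic placeholders; the paper's own notation may differ):

```latex
% Hölder continuity of the p-th derivative, with exponent \nu and constant H_\nu:
\|\nabla^p f(x) - \nabla^p f(y)\| \le H_\nu \,\|x - y\|^{\nu},
\qquad \nu \in (0, 1],
% which recovers Lipschitz continuity of \nabla^p f when \nu = 1.

% Uniform convexity of degree q \ge 2 with parameter \sigma > 0:
f(y) \ge f(x) + \langle \nabla f(x),\, y - x \rangle
      + \frac{\sigma}{q}\,\|y - x\|^{q},
% which recovers strong convexity when q = 2.
```

Restart schemes of the kind mentioned in the abstract typically follow a generic pattern; the sketch below illustrates that idea only, with `accel_solve` a hypothetical black-box accelerated solver (the paper's actual scheme and its restart criterion are given in the full text):

```python
def restarted_accel(f, x0, accel_solve, eps_target, eps0=1.0):
    """Generic restart loop: run an accelerated solver to a geometrically
    shrinking accuracy, warm-starting each round from the previous output.
    Under uniform convexity, each round contracts the distance to the
    minimizer, which is what yields the improved overall complexity."""
    x, eps = x0, eps0
    while eps > eps_target:
        # Hypothetical black box: returns a point x with f(x) - f* <= eps.
        x = accel_solve(f, x, eps)
        eps /= 2.0  # halve the target suboptimality for the next round
    return x
```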
