Hölder Gradient Descent and Adaptive Regularization Methods in Banach Spaces for First-Order Points

04/06/2021
by Serge Gratton, et al.

This paper considers optimization of smooth nonconvex functionals in smooth infinite-dimensional spaces. A Hölder gradient descent algorithm is first proposed for finding approximate first-order points of regularized polynomial functionals. This method is then applied to analyze the evaluation complexity of an adaptive regularization method which searches for approximate first-order points of functionals with β-Hölder continuous derivatives. It is shown that finding an ϵ-approximate first-order point requires at most O(ϵ^(-(p+β)/(p+β-1))) evaluations of the functional and its first p derivatives.
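To illustrate the first ingredient, the following is a minimal finite-dimensional sketch of a gradient descent step adapted to a β-Hölder continuous gradient. The paper works in infinite-dimensional Banach spaces; this analogue, the test function, and the constants L and beta below are illustrative assumptions, not the authors' algorithm or code.

```python
import numpy as np

# Minimal finite-dimensional sketch of Hölder gradient descent
# (assumed illustration; the paper's setting is a Banach space).
#
# If the gradient is beta-Hölder continuous with constant L,
#   ||grad f(x) - grad f(y)|| <= L * ||x - y||**beta,
# a descent-lemma argument suggests the step length
#   t_k = (||g_k||**(1 - beta) / L)**(1 / beta),
# giving a decrease of order ||g_k||**((1 + beta) / beta).

def holder_gradient_descent(grad, x0, L, beta, eps=1e-4, max_iter=10_000):
    """Run gradient descent with a Hölder-adapted step until ||grad|| <= eps."""
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= eps:  # epsilon-approximate first-order point reached
            return x, k
        t = (gnorm ** (1.0 - beta) / L) ** (1.0 / beta)
        x = x - t * g
    return x, max_iter

# Example on a smooth nonconvex test function (assumed for illustration):
#   f(x) = sum(x**2) + 0.1 * sum(sin(5 * x)),  grad f(x) = 2x + 0.5 * cos(5x).
grad = lambda x: 2 * x + 0.5 * np.cos(5 * x)
x_star, iters = holder_gradient_descent(grad, x0=np.ones(5), L=4.5, beta=1.0)
print(f"approximate first-order point after {iters} iterations")
```

With beta = 1.0 the step reduces to the familiar 1/L step of Lipschitz-gradient descent; smaller beta values shorten the step where the gradient is large, reflecting the weaker smoothness assumption.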
