Higher-order methods for convex-concave min-max optimization and monotone variational inequalities

07/09/2020
by Brian Bullins, et al.

We provide improved convergence rates for constrained convex-concave min-max problems and monotone variational inequalities with higher-order smoothness. In min-max settings where the p^th-order derivatives are Lipschitz continuous, we give an algorithm HigherOrderMirrorProx that achieves an iteration complexity of O(1/T^((p+1)/2)) when given access to an oracle for finding a fixed point of a p^th-order equation. We give analogous rates for the weak monotone variational inequality problem. For p>2, our results improve upon the iteration complexity of the first-order Mirror Prox method of Nemirovski [2004] and the second-order method of Monteiro and Svaiter [2012]. We further instantiate our entire algorithm in the unconstrained p=2 case.
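To make the algorithmic template concrete, below is a minimal NumPy sketch of the first-order (p = 1) Mirror Prox / extragradient step of Nemirovski [2004], which the paper's HigherOrderMirrorProx generalizes, applied to a toy bilinear saddle-point problem. The problem instance, step size, and Euclidean ball constraints are illustrative assumptions and are not taken from the paper; the higher-order method would replace the single gradient-based leading step with the p^th-order fixed-point oracle mentioned in the abstract.

```python
import numpy as np

# Minimal sketch, NOT the paper's HigherOrderMirrorProx: the classical
# first-order (p = 1) Mirror Prox / extragradient template of Nemirovski [2004]
# on a toy bilinear saddle-point problem  min_{|x|<=1} max_{|y|<=1}  x^T A y.
# The problem instance, step size, and Euclidean setup are illustrative assumptions.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))

def F(z):
    """Monotone operator of the saddle problem: F(x, y) = (A y, -A^T x)."""
    x, y = z[:n], z[n:]
    return np.concatenate([A @ y, -A.T @ x])

def project(z, radius=1.0):
    """Project each block (x and y) onto a Euclidean ball of the given radius."""
    def proj(v):
        nrm = np.linalg.norm(v)
        return v if nrm <= radius else v * (radius / nrm)
    return np.concatenate([proj(z[:n]), proj(z[n:])])

L = np.linalg.norm(A, 2)      # Lipschitz constant of F (spectral norm of A)
eta = 1.0 / L                 # step size from the classical O(1/T) analysis
z = project(rng.standard_normal(2 * n))
z_avg = np.zeros(2 * n)

T = 500
for t in range(1, T + 1):
    z_half = project(z - eta * F(z))     # extrapolation (leading) step
    z = project(z - eta * F(z_half))     # correction step using the leading gradient
    z_avg += (z_half - z_avg) / t        # ergodic average of the leading points

x_avg, y_avg = z_avg[:n], z_avg[n:]
# Duality gap over the unit balls: max_{y'} x^T A y' - min_{x'} x'^T A y
gap = np.linalg.norm(A.T @ x_avg) + np.linalg.norm(A @ y_avg)
print(f"duality gap after {T} iterations: {gap:.3e}")
```

The classical analysis bounds the duality gap of the ergodic average at rate O(1/T); the paper's higher-order variant, roughly speaking, swaps the leading gradient step for the solution of a p^th-order equation to obtain the faster O(1/T^((p+1)/2)) rate.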
