Alternating Implicit Projected SGD and Its Efficient Variants for Equality-constrained Bilevel Optimization

11/14/2022
by Quan Xiao, et al.

Stochastic bilevel optimization, which captures the inherent nested structure of machine learning problems, is gaining popularity in many recent applications. Existing works on bilevel optimization mostly consider either unconstrained problems or constrained upper-level problems. This paper considers stochastic bilevel optimization problems with equality constraints in both the upper and lower levels. By leveraging the special structure of the equality constraints, the paper first presents an alternating implicit projected SGD approach and establishes an Õ(ϵ^-2) sample complexity that matches the state-of-the-art complexity of ALSET for unconstrained bilevel problems. To further reduce the cost of projection, the paper presents two alternating implicit projection-efficient SGD approaches: one enjoys Õ(ϵ^-2/T) upper-level and O(ϵ^-1.5/T^3/4) lower-level projection complexity with O(T) lower-level batch size, and the other enjoys Õ(ϵ^-1.5) upper-level and lower-level projection complexity with O(1) batch size. An application to federated bilevel optimization is presented to showcase the empirical performance of our algorithms. Our results demonstrate that equality-constrained bilevel optimization with strongly convex lower-level problems can be solved as efficiently as stochastic single-level optimization problems.
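To make the alternating structure concrete, below is a minimal sketch of one iteration of an alternating implicit projected SGD step, assuming affine constraints A_u x = b_u (upper level) and A_l y = b_l (lower level) so that the projections have closed form. The stochastic lower-level gradient `grad_g` and the implicit hypergradient estimator `hypergrad` are hypothetical placeholders standing in for the paper's estimators, not their exact construction.

```python
import numpy as np

def affine_projection(A, b):
    """Euclidean projection onto {z : A z = b}, assuming A has full row rank."""
    AAT_inv = np.linalg.inv(A @ A.T)
    def project(z):
        return z - A.T @ (AAT_inv @ (A @ z - b))
    return project

def alternating_projected_sgd_step(x, y, grad_g, hypergrad, proj_x, proj_y,
                                   alpha=1e-2, beta=1e-2, inner_steps=1):
    # Lower level: projected SGD on the strongly convex inner objective g(x, y).
    for _ in range(inner_steps):
        y = proj_y(y - beta * grad_g(x, y))
    # Upper level: projected SGD step along an implicit (hyper)gradient estimate.
    x = proj_x(x - alpha * hypergrad(x, y))
    return x, y
```

In this sketch, `proj_x = affine_projection(A_u, b_u)` and `proj_y = affine_projection(A_l, b_l)`; the projection-efficient variants discussed in the abstract reduce how often these projections are invoked.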
