Stochastic Multi-level Composition Optimization Algorithms with Level-Independent Convergence Rates
In this paper, we study smooth stochastic multi-level composition optimization problems, where the objective function is a nested composition of T functions. We assume access to noisy evaluations of the functions and their gradients through a stochastic first-order oracle. For solving this class of problems, we propose two algorithms using moving-average stochastic estimates, and analyze their convergence to an ϵ-stationary point of the problem. We show that the first algorithm, which is a generalization of the algorithm in [22] to the T-level case, can achieve a sample complexity of 𝒪(1/ϵ^6) by using mini-batches of samples in each iteration. By modifying this algorithm with linearized stochastic estimates of the function values, we improve the sample complexity to 𝒪(1/ϵ^4). This modification also removes the requirement of having a mini-batch of samples in each iteration. To the best of our knowledge, this is the first time that an online algorithm designed for the (un)constrained multi-level setting obtains the same sample complexity as the smooth single-level setting, under mild assumptions on the stochastic first-order oracle.
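To illustrate the moving-average idea in its simplest form, the following is a minimal sketch for the two-level case F(x) = f(g(x)), not the paper's exact method: the inner value g(x) is tracked by an exponential moving average, and the gradient is formed via the chain rule using the tracked value. The oracle functions, noise levels, and step sizes below are placeholder assumptions for illustration only.

```python
import numpy as np

# Hypothetical stochastic oracles (placeholders, not from the paper):
# inner function g(x) = x**2 (elementwise) and outer function f(u) = sum(u),
# each observed with additive Gaussian noise.

def noisy_g(x, rng):
    return x**2 + 0.01 * rng.standard_normal(x.shape)

def noisy_grad_g(x, rng):
    return 2.0 * x + 0.01 * rng.standard_normal(x.shape)

def noisy_grad_f(u, rng):
    return np.ones_like(u) + 0.01 * rng.standard_normal(u.shape)

def moving_average_sgd(x0, iters=1000, alpha=0.01, beta=0.1, seed=0):
    """Sketch of a moving-average scheme for F(x) = f(g(x))."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    u = noisy_g(x, rng)  # moving-average tracker of the inner value g(x)
    for _ in range(iters):
        # Update the tracker with a fresh noisy sample of g(x).
        u = (1.0 - beta) * u + beta * noisy_g(x, rng)
        # Chain rule using the tracked inner value instead of a fresh
        # (expensive or high-variance) estimate of g(x).
        grad = noisy_grad_g(x, rng) * noisy_grad_f(u, rng)
        x -= alpha * grad
    return x

x_star = moving_average_sgd(np.array([1.0, -2.0]))
```

The design point this sketch captures is that the tracker u averages noise across iterations, so each step needs only a single sample per oracle; extending the same idea level by level yields the T-level algorithms analyzed in the paper.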