Robustness of Quantum Algorithms for Nonconvex Optimization

12/05/2022
by Weiyuan Gong et al.

Recent results suggest that quantum computers have the potential to speed up nonconvex optimization. However, a crucial factor for the implementation of quantum optimization algorithms is their robustness against experimental and statistical noise. In this paper, we systematically study quantum algorithms for finding an ϵ-approximate second-order stationary point (ϵ-SOSP) of a d-dimensional nonconvex function, a fundamental problem in nonconvex optimization, with noisy zeroth- or first-order oracles as inputs. We first prove that, up to noise of O(ϵ^10/d^5), accelerated perturbed gradient descent with quantum gradient estimation takes O(log d/ϵ^1.75) quantum queries to find an ϵ-SOSP. We then prove that perturbed gradient descent is robust to noise of O(ϵ^6/d^4) on the zeroth-order oracle and O(ϵ/d^(0.5+ζ)) for ζ>0 on the first-order oracle, which yields a quantum algorithm with poly-logarithmic query complexity. We further propose a stochastic gradient descent algorithm using quantum mean estimation on the Gaussian smoothing of noisy oracles, which is robust to O(ϵ^1.5/d) and O(ϵ/√d) noise on the zeroth- and first-order oracles, respectively; this quantum algorithm takes O(d^2.5/ϵ^3.5) and O(d^2/ϵ^3) queries to the two oracles, a polynomial speedup over the classical counterparts. Moreover, we characterize the regimes in which quantum algorithms can find an ϵ-SOSP with a poly-logarithmic, polynomial, or exponential number of queries in d, or in which the problem is information-theoretically unsolvable even with an infinite number of queries. In addition, we prove an Ω(ϵ^(-12/7)) lower bound in ϵ for any randomized classical or quantum algorithm to find an ϵ-SOSP using either noisy zeroth- or first-order oracles.
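To give a concrete feel for the classical ingredients the abstract refers to, here is a minimal Python sketch of perturbed gradient descent driven by a Gaussian-smoothing gradient estimator built from a noisy zeroth-order oracle. This is an illustration only, not the paper's algorithm: the function names, step size, smoothing radius, sample count, and the simplified perturb-when-gradient-is-small rule are all assumptions of this sketch, and the paper's quantum speedups come from replacing the classical estimator with quantum gradient/mean estimation.

```python
import numpy as np

def smoothed_gradient(f, x, sigma=1e-2, num_samples=200, rng=None):
    """Estimate the gradient of the Gaussian smoothing
    f_sigma(x) = E_u[f(x + sigma*u)], u ~ N(0, I), via the identity
    grad f_sigma(x) = E_u[u * (f(x + sigma*u) - f(x)) / sigma].
    Only noisy zeroth-order (function-value) queries to f are used."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    u = rng.standard_normal((num_samples, d))
    vals = np.array([f(x + sigma * ui) for ui in u]) - f(x)
    return (u * vals[:, None]).mean(axis=0) / sigma

def perturbed_gd(f, x0, eta=0.05, eps=1e-2, perturb_radius=1e-2,
                 iters=500, rng=None):
    """Simplified perturbed gradient descent: take gradient steps using the
    smoothed estimator; when the estimated gradient is small (a candidate
    saddle or minimum), inject a random perturbation to escape strict
    saddle points. Hyperparameters here are illustrative, not tuned."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        g = smoothed_gradient(f, x, rng=rng)
        if np.linalg.norm(g) < eps:
            # Small estimated gradient: perturb to escape a possible saddle.
            x = x + perturb_radius * rng.standard_normal(x.size)
        else:
            x = x - eta * g
    return x
```

On the hypothetical test function f(x) = (x0^2 - 1)^2 + x1^2, which has a strict saddle at the origin and minima at (±1, 0), the routine started at the saddle escapes and settles near a minimum even when each oracle call carries small additive noise.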
