Information-Theoretic Lower Bounds for Zero-Order Stochastic Gradient Estimation

In this paper we analyze the number of samples necessary to estimate the gradient of any multidimensional smooth (possibly non-convex) function in a zero-order stochastic oracle model. In this model, an estimator has access only to noisy evaluations of the function, from which it must produce an estimate of the gradient. We also analyze the number of samples sufficient for the finite difference method, a classical technique from numerical analysis. For T samples and d dimensions, our information-theoretic lower bound is Ω(√(d/T)). We show that the finite difference method achieves rate O(d^(4/3)/√T) for functions whose third- and higher-order derivatives vanish. The finite difference method is therefore not minimax optimal, leaving room for the development of better gradient estimation methods.
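As a concrete illustration of the oracle model, here is a minimal sketch of a central finite difference gradient estimator that queries a noisy zero-order oracle; the function names, step size h, and per-coordinate sample count are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

def fd_gradient_estimate(noisy_f, x, h=1e-2, samples_per_coord=10):
    """Central finite-difference gradient estimate from a noisy zero-order oracle.

    noisy_f(x) returns f(x) plus zero-mean noise; averaging repeated
    queries per coordinate reduces the noise variance. The total number
    of oracle calls is T = 2 * d * samples_per_coord.
    """
    d = x.shape[0]
    grad = np.zeros(d)
    for i in range(d):
        e = np.zeros(d)
        e[i] = h
        # Average repeated noisy evaluations at x + h*e_i and x - h*e_i.
        f_plus = np.mean([noisy_f(x + e) for _ in range(samples_per_coord)])
        f_minus = np.mean([noisy_f(x - e) for _ in range(samples_per_coord)])
        grad[i] = (f_plus - f_minus) / (2 * h)
    return grad

# Hypothetical usage: a quadratic f(x) = 0.5 * ||x||^2 with Gaussian noise,
# whose true gradient at x is x itself.
rng = np.random.default_rng(0)
f = lambda x: 0.5 * x @ x
noisy_f = lambda x: f(x) + rng.normal(scale=0.1)
x0 = np.array([1.0, -2.0, 0.5])
print(fd_gradient_estimate(noisy_f, x0))  # approximately [1.0, -2.0, 0.5]
```

The step size h trades off deterministic discretization bias against the amplified oracle noise, which is divided by 2h; this tension is what drives the suboptimal rate analyzed in the paper.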
