Order-Optimal Error Bounds for Noisy Kernel-Based Bayesian Quadrature

02/22/2022
by Xu Cai et al.

In this paper, we study the sample complexity of noisy Bayesian quadrature (BQ), in which we seek to approximate an integral based on noisy black-box queries to the underlying function. We consider functions in a Reproducing Kernel Hilbert Space (RKHS) with the Matérn-ν kernel, focusing on combinations of the parameter ν and dimension d such that the RKHS is equivalent to a Sobolev class. In this setting, we provide near-matching upper and lower bounds on the best possible average error. Specifically, we find that when the black-box queries are subject to Gaussian noise having variance σ^2, any algorithm making at most T queries (even with adaptive sampling) must incur a mean absolute error of Ω(T^{-ν/d-1} + σ T^{-1/2}), and there exists a non-adaptive algorithm attaining an error of at most O(T^{-ν/d-1} + σ T^{-1/2}). Hence, the bounds are order-optimal, and establish that there is no adaptivity gap in terms of scaling laws.
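For concreteness, below is a minimal one-dimensional sketch of kernel-based BQ with noisy observations; it illustrates the setting, not the paper's construction. The estimate is the standard GP posterior mean of the integral, z^T (K + σ^2 I)^{-1} y, where K is the Gram matrix at the query points and z collects the kernel mean embeddings ∫ k(x, x_i) dx. The integrand, lengthscale, budget, and noise level are illustrative assumptions.

```python
import numpy as np

def matern32(r, ell=0.2):
    # Matérn kernel with nu = 3/2; ell is a hypothetical lengthscale.
    s = np.sqrt(3.0) * r / ell
    return (1.0 + s) * np.exp(-s)

rng = np.random.default_rng(0)

# Hypothetical smooth integrand on [0, 1] (not from the paper).
f = lambda x: np.sin(3.0 * x) + x
true_integral = (1.0 - np.cos(3.0)) / 3.0 + 0.5

T = 50        # query budget
sigma = 0.1   # noise standard deviation

# Non-adaptive design: equally spaced query points. The paper's upper
# bound is attained non-adaptively; this particular design is only
# an illustration of that idea.
X = (np.arange(T) + 0.5) / T
y = f(X) + sigma * rng.standard_normal(T)

# Gram matrix, and kernel mean embedding z_i = \int_0^1 k(x, X_i) dx
# approximated by a midpoint rule on a fine grid.
K = matern32(np.abs(X[:, None] - X[None, :]))
grid = (np.arange(5000) + 0.5) / 5000
z = matern32(np.abs(grid[:, None] - X[None, :])).mean(axis=0)

# BQ posterior mean of the integral: z^T (K + sigma^2 I)^{-1} y.
weights = np.linalg.solve(K + sigma**2 * np.eye(T), z)
estimate = weights @ y
print(f"BQ estimate = {estimate:.4f}, true integral = {true_integral:.4f}")
```

Note that the observation noise enters the estimate only through the regularizer σ^2 I, which is consistent with the additive σ T^{-1/2} term in the bounds: as σ → 0 the rate is governed by the smoothness term T^{-ν/d-1} alone.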
