The Sample Complexities of Global Lipschitz Optimization
We study the problem of black-box optimization of a Lipschitz function f defined on a compact subset X of R^d, both with algorithms that certify the accuracy of their recommendations and with those that do not. We investigate their sample complexities, i.e., the number of samples needed to reach or certify a given accuracy ε. We start by proving a tighter bound for the well-known DOO algorithm [Perevozchikov, 1990; Munos, 2011] that matches the best existing upper bounds for (more computationally challenging) non-certified algorithms. We then introduce and analyze a new certified version of DOO and prove a matching f-dependent lower bound (up to logarithmic terms) for all certified algorithms. We further show that this optimal quantity is proportional to ∫_X dx / (max_X f − f(x) + ε)^d, which as a corollary settles a three-decade-old conjecture of Hansen et al. [1991]. Finally, we show how to control the sample complexity of state-of-the-art non-certified algorithms with an integral reminiscent of the Dudley entropy integral.
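To build intuition for the complexity integral ∫_X dx / (max_X f − f(x) + ε)^d mentioned above, the short Python sketch below estimates it by Monte Carlo for a toy 1-Lipschitz function on [0, 1]^2. This is only an illustration and is not taken from the paper: the choice of function f, its maximizer x_star, and the sample size are assumptions made purely for the example.

```python
import numpy as np

# Illustrative sketch (not from the paper): Monte Carlo estimate of the
# complexity integral  I(f, eps) = ∫_X dx / (max_X f - f(x) + eps)^d
# for a toy Lipschitz function on X = [0, 1]^2.

rng = np.random.default_rng(0)
d = 2                              # dimension of the domain X = [0, 1]^d
x_star = np.array([0.3, 0.7])      # assumed maximizer of the toy function

def f(x):
    # 1-Lipschitz (sup-norm) toy function with maximum 0 attained at x_star
    return -np.max(np.abs(x - x_star), axis=-1)

f_max = 0.0                        # max_X f for this toy function

def complexity_integral(eps, n_samples=200_000):
    """Monte Carlo estimate of ∫_X dx / (max_X f - f(x) + eps)^d on [0, 1]^d."""
    x = rng.random((n_samples, d))  # uniform samples; vol([0, 1]^d) = 1
    return np.mean(1.0 / (f_max - f(x) + eps) ** d)

for eps in (0.1, 0.01, 0.001):
    print(f"eps = {eps:>6}: I(f, eps) ≈ {complexity_integral(eps):.1f}")
```

Running the sketch shows the estimate growing as ε shrinks, which is the qualitative behavior one expects from a sample-complexity measure: smaller target accuracies make the region where f is near its maximum weigh more heavily in the integral.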