A sharp upper bound for sampling numbers in L_2
For a class F of complex-valued functions on a set D, we denote by g_n(F) its sampling numbers, i.e., the minimal worst-case error on F, measured in L_2, that can be achieved with a recovery algorithm based on n function evaluations. We prove that there is a universal constant c>0 such that, if F is the unit ball of a separable reproducing kernel Hilbert space, then g_{cn}(F)^2 ≤ (1/n) ∑_{k≥n} d_k(F)^2, where d_k(F) are the Kolmogorov widths (or approximation numbers) of F in L_2. We also obtain similar upper bounds for more general classes F, including all compact subsets of the space of continuous functions on a bounded domain D⊂ℝ^d, and show that these bounds are sharp by providing examples where the converse inequality holds up to a constant.
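A quick numerical sketch may help illustrate the strength of the bound. Assuming hypothetical widths d_k(F) = k^{-s} with s > 1/2 (a standard model case, not from the abstract), the tail sum (1/n) ∑_{k≥n} k^{-2s} behaves like n^{-2s}/(2s−1), so the bound forces the sampling numbers to decay at the same polynomial rate as the widths themselves:

```python
import math

def tail_bound(n, s, K=200000):
    # Right-hand side of the bound, (1/n) * sum_{k>=n} d_k(F)^2,
    # for the hypothetical widths d_k = k^{-s}; K truncates the tail.
    return sum(k ** (-2 * s) for k in range(n, K)) / n

s = 1.0
for n in (100, 200, 400):
    rhs = tail_bound(n, s)
    # For d_k = k^{-s} the tail sum is ~ n^{1-2s}/(2s-1), so the bound
    # gives g_{cn}(F)^2 <= ~ n^{-2s}/(2s-1): the normalized quantity
    # below should be close to 1 for every n.
    print(n, rhs, rhs * n ** (2 * s) * (2 * s - 1))
```

The point of the normalization is that d_n(F)^2 = n^{-2s}, so the upper bound for g_{cn}(F)^2 is a constant multiple of d_n(F)^2: in this model case sampling is as powerful as arbitrary linear information, up to the constant c.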