The L^∞ Learnability of Reproducing Kernel Hilbert Spaces

06/05/2023
by Hongrui Chen, et al.

In this work, we analyze the learnability of reproducing kernel Hilbert spaces (RKHS) under the L^∞ norm, which is critical for understanding the performance of kernel methods and random feature models in safety- and security-critical applications. Specifically, we relate the L^∞ learnability of an RKHS to the spectral decay of the associated kernel, and we establish both lower and upper bounds on the sample complexity. In particular, for dot-product kernels on the sphere, we identify conditions under which L^∞ learning can be achieved with polynomially many samples. Let d denote the input dimension and assume the kernel spectrum roughly decays as λ_k ∼ k^{-1-β} with β > 0. We prove that if β is independent of the input dimension d, then functions in the RKHS can be learned efficiently under the L^∞ norm, i.e., the sample complexity depends polynomially on d. In contrast, if β = 1/poly(d), then L^∞ learning requires exponentially many samples.
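To make the decay condition λ_k ∼ k^{-1-β} concrete, the following is a minimal sketch (not from the paper) of how one might empirically estimate the exponent β for a dot-product kernel on the sphere: eigendecompose an empirical Gram matrix and fit a power law to the mid-range eigenvalues. The kernel exp(⟨x, y⟩), the sample sizes, and the fitting window are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 2000

# Sample n points uniformly on the unit sphere S^{d-1}.
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)

# Illustrative dot-product kernel k(x, y) = exp(<x, y>);
# any dot-product kernel on the sphere could be used here.
G = np.exp(X @ X.T)

# Eigenvalues of G / n approximate the kernel operator's spectrum.
lam = np.linalg.eigvalsh(G / n)[::-1]  # sort descending
lam = lam[lam > 1e-12]                 # drop numerically zero tail

# Fit log(lambda_k) ~ -(1 + beta) * log(k) over a mid-range of indices,
# skipping the first few eigenvalues and the noisy tail.
k = np.arange(1, len(lam) + 1)
lo, hi = 5, min(200, len(lam))
slope, _ = np.polyfit(np.log(k[lo:hi]), np.log(lam[lo:hi]), 1)
beta = -slope - 1
print(f"estimated decay exponent beta ≈ {beta:.2f}")
```

Under the paper's dichotomy, an estimated β bounded away from 0 as d grows would place the kernel in the polynomial-sample regime, while β shrinking like 1/poly(d) would signal the exponential regime.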
