On the Improved Rates of Convergence for Matérn-type Kernel Ridge Regression, with Application to Calibration of Computer Models

01/01/2020
by Rui Tuo, et al.

Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator, under both the L_2 norm and the norm of the reproducing kernel Hilbert space, exceed the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy-O'Hagan approach for calibrating the model parameters of computer simulations. We prove that, under certain conditions, the Kennedy-O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space.
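To make the object of study concrete, below is a minimal sketch of the kernel ridge regression estimator with a Matérn kernel of smoothness nu = 3/2, which solves (K + n*lambda*I) alpha = y and predicts via f_hat(x) = sum_i alpha_i k(x, x_i). The length scale, regularization parameter lam, and test function are illustrative assumptions, not values taken from the paper.

    # Sketch: kernel ridge regression with a Matérn (nu = 3/2) kernel.
    # All hyperparameters below are illustrative, not from the paper.
    import numpy as np

    def matern32(X, Y, length_scale=0.5):
        """Matérn nu=3/2 kernel: k(r) = (1 + sqrt(3) r / l) * exp(-sqrt(3) r / l)."""
        r = np.abs(X[:, None] - Y[None, :])
        s = np.sqrt(3.0) * r / length_scale
        return (1.0 + s) * np.exp(-s)

    def krr_fit(X_train, y_train, lam=1e-3):
        """Solve (K + n * lam * I) alpha = y for the representer coefficients."""
        n = len(X_train)
        K = matern32(X_train, X_train)
        return np.linalg.solve(K + n * lam * np.eye(n), y_train)

    def krr_predict(X_test, X_train, alpha):
        """Evaluate f_hat(x) = sum_i alpha_i * k(x, x_i)."""
        return matern32(X_test, X_train) @ alpha

    # Usage: recover a smooth function from noisy observations.
    rng = np.random.default_rng(0)
    X_train = np.sort(rng.uniform(0.0, 1.0, 50))
    y_train = np.sin(2 * np.pi * X_train) + 0.1 * rng.standard_normal(50)
    alpha = krr_fit(X_train, y_train)
    X_test = np.linspace(0.0, 1.0, 200)
    f_hat = krr_predict(X_test, X_train, alpha)
    print("max error on grid:", np.max(np.abs(f_hat - np.sin(2 * np.pi * X_test))))

The paper's convergence rates concern how the error of f_hat shrinks as the sample size n grows under a suitable schedule for the regularization parameter; the sketch above only illustrates the estimator itself at a fixed n.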
