Adaptive Learning Rates for Support Vector Machines Working on Data with Low Intrinsic Dimension

03/13/2020
by Thomas Hamm, et al.

We derive improved regression and classification rates for support vector machines using Gaussian kernels under the assumption that the data has some low-dimensional intrinsic structure described by the box-counting dimension. Under standard regularity assumptions for regression and classification, we prove learning rates in which the dimension of the ambient space is replaced by the box-counting dimension of the support of the data-generating distribution. In the regression case our rates are minimax optimal, whereas in the classification case our rates match the form of the best known rates. Furthermore, we show that a training-validation approach for choosing the hyperparameters of an SVM in a data-dependent way achieves the same rates adaptively, that is, without any knowledge of the data-generating distribution.
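The adaptive procedure in the abstract selects the SVM hyperparameters on a held-out validation set rather than from knowledge of the distribution. A minimal sketch of that idea, using scikit-learn's RBF-kernel SVR on synthetic data whose inputs lie on a one-dimensional curve embedded in a higher-dimensional ambient space (the data, the candidate grid over C and gamma, and the split ratio are all illustrative assumptions, not the paper's exact construction):

```python
# Hedged sketch: data-dependent hyperparameter selection for an SVM
# with a Gaussian (RBF) kernel via a training-validation split.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic data with low intrinsic dimension: inputs trace a
# 1-dimensional curve embedded in a 5-dimensional ambient space.
t = rng.uniform(0, 1, size=400)
X = np.column_stack([np.sin(2 * np.pi * k * t) for k in range(1, 6)])
y = np.cos(2 * np.pi * t) + 0.1 * rng.normal(size=t.shape)

# Hold out a validation set; hyperparameters are chosen purely from data.
X_tr, X_val, y_tr, y_val = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Illustrative candidate grid over regularization C and kernel width gamma.
candidates = [(C, g) for C in (0.1, 1.0, 10.0) for g in (0.1, 1.0, 10.0)]

# Pick the pair minimizing validation mean squared error.
best = min(
    candidates,
    key=lambda p: mean_squared_error(
        y_val,
        SVR(kernel="rbf", C=p[0], gamma=p[1]).fit(X_tr, y_tr).predict(X_val),
    ),
)
print("selected (C, gamma):", best)
```

The selected pair is then used to retrain on the full sample; the paper's point is that this purely data-driven choice attains the same rates as if the box-counting dimension were known.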
