Taming Nonconvexity in Kernel Feature Selection—Favorable Properties of the Laplace Kernel

06/17/2021
by Feng Ruan, et al.

Kernel-based feature selection is an important tool in nonparametric statistics. Despite many practical applications of kernel-based feature selection, there is little statistical theory available to support the method. A core challenge is that the objective functions of the optimization problems used to define kernel-based feature selection are nonconvex. The literature has only studied the statistical properties of the global optima, which is a mismatch, given that the gradient-based algorithms available for nonconvex optimization are only able to guarantee convergence to local minima. Studying the full landscape associated with kernel-based methods, we show that feature selection objectives using the Laplace kernel (and other ℓ_1 kernels) come with statistical guarantees that other kernels, including the ubiquitous Gaussian kernel (and other ℓ_2 kernels), do not possess. Based on a sharp characterization of the gradient of the objective function, we show that ℓ_1 kernels eliminate unfavorable stationary points that appear when using an ℓ_2 kernel. Armed with this insight, we establish statistical guarantees for ℓ_1 kernel-based feature selection that do not require reaching the global minimum. In particular, we establish model-selection consistency of ℓ_1-kernel-based feature selection in recovering main effects and hierarchical interactions in the nonparametric setting with n ∼ log p samples.
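
To make the objects in the abstract concrete, the sketch below is a minimal, purely illustrative picture of a kernel-based feature-selection objective: per-feature kernel weights are learned by projected gradient descent on a simplified regularized kernel-ridge criterion with a weight penalty. This is an assumption for the demo, not the paper's exact objective; the function names, the penalty, and all hyperparameters (lam, rho, the step size) are made up here. Its only purpose is to show the structural difference between an ℓ_1 (Laplace-type) and an ℓ_2 (Gaussian-type) weighted kernel inside such a nonconvex objective.

```python
import numpy as np

def weighted_kernel(X, Z, beta, norm="l1"):
    """Pairwise kernel with nonnegative per-feature weights beta.
    norm="l1": Laplace-type kernel exp(-sum_j beta_j |x_j - z_j|).
    norm="l2": Gaussian-type kernel exp(-sum_j beta_j (x_j - z_j)^2)."""
    diff = X[:, None, :] - Z[None, :, :]                    # shape (n, m, p)
    dist = (np.abs(diff) if norm == "l1" else diff ** 2) @ beta
    return np.exp(-dist)

def objective(beta, X, y, lam=0.1, rho=0.05, norm="l1"):
    """Simplified feature-selection criterion (an assumption, not the
    paper's objective): kernel-ridge training loss as a function of beta,
    plus an l1 penalty on the weights to discourage selecting everything."""
    n = len(y)
    K = weighted_kernel(X, X, beta, norm=norm)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    resid = y - K @ alpha
    return resid @ resid / n + rho * beta.sum()

def numerical_grad(f, beta, eps=1e-5):
    """Central-difference gradient; keeps the demo dependency-free."""
    grad = np.zeros_like(beta)
    for j in range(beta.size):
        e = np.zeros_like(beta)
        e[j] = eps
        grad[j] = (f(beta + e) - f(beta - e)) / (2 * eps)
    return grad

rng = np.random.default_rng(0)
n, p = 80, 10
X = rng.normal(size=(n, p))
# Only features 0 and 1 are relevant; the remaining eight are noise.
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n)

beta = np.full(p, 0.1)
f = lambda b: objective(b, X, y, norm="l1")   # swap in norm="l2" to compare
for _ in range(200):
    # Projected gradient step: the weights must stay nonnegative.
    beta = np.maximum(beta - 0.05 * numerical_grad(f, beta), 0.0)
print("learned weights:", np.round(beta, 3))
```

Because the criterion is nonconvex in beta, gradient descent only reaches a stationary point; the paper's claim is that with the ℓ_1 (Laplace-type) kernel the unfavorable stationary points of such objectives are eliminated, so local convergence already carries statistical guarantees, whereas the ℓ_2 (Gaussian-type) variant does not enjoy this property.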
