A gradient-based variable selection for binary classification in reproducing kernel Hilbert space

09/29/2021
by Jongkyeong Kang, et al.

Variable selection is essential in high-dimensional data analysis. Although various variable selection methods have been developed, most rely on the linear model assumption. This article proposes a nonparametric variable selection method for the large-margin classifier defined in a reproducing kernel Hilbert space (RKHS). We propose a gradient-based representation of the large-margin classifier and then regularize the gradient functions by the group-lasso penalty to obtain sparse gradients, which naturally lead to variable selection. The groupwise-majorization-descent (GMD) algorithm (Yang and Zou, 2015) is employed to solve the resulting optimization problem efficiently despite its large number of parameters. We employ the strong sequential rule (Tibshirani et al., 2012) to facilitate the tuning procedure. The selection consistency of the proposed method is established by obtaining the risk bound of the estimated classifier and its gradient. Finally, we demonstrate the promising performance of the proposed method through simulations and a real-data illustration.
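The core idea, that a variable is irrelevant when the classifier's partial derivative with respect to it is (nearly) zero everywhere, can be illustrated with a minimal sketch. The code below is not the authors' penalized estimator: it fits a plain kernel ridge classifier on +/-1 labels, computes the empirical gradient of the fitted decision function in each coordinate, and applies a group-wise soft-threshold (the proximal map of a group-lasso penalty) to the per-variable gradient norms as a crude selection rule. The Gaussian kernel, the toy data, and the values of gamma, ridge, and lam are all illustrative assumptions; the group-lasso-penalized fit, the GMD solver, and the strong-rule screening described in the abstract are omitted.

```python
# Sketch only: gradient-norm-based variable screening for a kernel classifier.
# Not the paper's estimator; hyperparameters below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, p, gamma, ridge, lam = 200, 10, 0.5, 1e-2, 0.05

# Toy data: only x1 and x2 drive the label, the remaining variables are noise.
X = rng.normal(size=(n, p))
y = np.sign(np.sin(2 * X[:, 0]) + X[:, 1] ** 2 - 1.0 + 0.1 * rng.normal(size=n))

# Gaussian kernel matrix K[i, m] = exp(-gamma * ||x_i - x_m||^2).
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-gamma * sq)

# Kernel ridge fit on +/-1 labels: f(x) = sum_i alpha_i K(x_i, x).
alpha = np.linalg.solve(K + n * ridge * np.eye(n), y)

# Empirical gradient of f at each training point:
# d f(x_m) / d x_j = sum_i alpha_i * (-2*gamma) * (x_mj - x_ij) * K(x_i, x_m).
grad = np.empty((n, p))
for j in range(p):
    diff = X[:, None, j] - X[None, :, j]          # diff[m, i] = x_mj - x_ij
    grad[:, j] = ((-2.0 * gamma) * diff * K) @ alpha

# Per-variable gradient norm, then a group-wise soft-threshold as a stand-in
# for the group-lasso shrinkage; variables with nonzero shrunk norms are kept.
norms = np.sqrt((grad ** 2).mean(axis=0))
shrunk = np.maximum(0.0, norms - lam)
print("gradient norms:", np.round(norms, 3))
print("selected variables:", np.flatnonzero(shrunk > 0))
```

In this sketch the informative variables should show markedly larger gradient norms than the noise variables; in the paper the sparsity is instead induced directly by the group-lasso penalty during estimation rather than by post-hoc thresholding.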
