Wild Residual Bootstrap Inference for Penalized Quantile Regression with Heteroscedastic Errors

07/20/2018
by Lan Wang, et al.

We consider a heteroscedastic regression model in which some of the regression coefficients are zero, but it is not known which ones. Penalized quantile regression is a useful approach for analyzing such data. By allowing different covariates to be relevant for modeling the conditional quantile function at different quantile levels, it provides a more complete picture of the conditional distribution of the response variable than mean regression. Existing work on penalized quantile regression has mostly focused on point estimation. Although bootstrap procedures have recently been shown to be effective for inference in penalized mean regression, they are not directly applicable to penalized quantile regression with heteroscedastic errors. We prove that a wild residual bootstrap procedure for unpenalized quantile regression is asymptotically valid for approximating the distribution of a penalized quantile regression estimator with an adaptive L_1 penalty, and that a modified version can be used to approximate the distribution of the L_1-penalized quantile regression estimator. The new methods do not require estimating the unknown error density function. We establish consistency, demonstrate finite-sample performance, and illustrate the application with a real data example.
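The abstract does not give implementation details, but the wild residual bootstrap it describes can be sketched in a few lines. The sketch below is an assumption-laden illustration, not the authors' code: it uses scikit-learn's QuantileRegressor as a stand-in for L_1-penalized quantile regression (check loss plus an L_1 penalty), and a two-point weight distribution of the kind used in the wild bootstrap literature for quantile regression, chosen so that P(w <= 0) = tau and the fitted tau-th conditional quantile is preserved in each bootstrap sample. Per the abstract, the unmodified procedure is justified for the adaptive-L_1 estimator, while the plain L_1 estimator requires a modified version that is not shown here.

```python
# Minimal sketch of a wild residual bootstrap around a penalized quantile
# regression fit. Assumptions: scikit-learn's QuantileRegressor is used as the
# L_1-penalized estimator, and the two-point weight law below is one choice
# from the wild bootstrap literature for quantile regression; it is not
# necessarily the weight distribution required by the paper's conditions.
import numpy as np
from sklearn.linear_model import QuantileRegressor


def wild_residual_bootstrap_qr(X, y, tau=0.5, alpha=0.1, n_boot=500, seed=0):
    rng = np.random.default_rng(seed)

    # L_1-penalized quantile regression fit at level tau
    fit = QuantileRegressor(quantile=tau, alpha=alpha, solver="highs").fit(X, y)
    beta_hat = np.concatenate(([fit.intercept_], fit.coef_))
    fitted = fit.predict(X)
    resid = y - fitted  # residuals at quantile tau

    boot_betas = np.empty((n_boot, beta_hat.size))
    for b in range(n_boot):
        # Two-point weights with P(w <= 0) = tau, so the fitted tau-th
        # conditional quantile is preserved in the bootstrap responses.
        w = rng.choice([2.0 * (1.0 - tau), -2.0 * tau],
                       size=len(y), p=[1.0 - tau, tau])
        y_star = fitted + w * np.abs(resid)  # wild residual bootstrap responses

        refit = QuantileRegressor(quantile=tau, alpha=alpha,
                                  solver="highs").fit(X, y_star)
        boot_betas[b] = np.concatenate(([refit.intercept_], refit.coef_))

    return beta_hat, boot_betas


# Example usage (X, y are the design matrix and response):
# beta_hat, draws = wild_residual_bootstrap_qr(X, y, tau=0.75)
# ci = np.percentile(draws, [2.5, 97.5], axis=0)  # componentwise percentile CIs
```

Percentile intervals computed from the bootstrap draws then give componentwise confidence intervals without estimating the error density, which is the practical advantage the abstract emphasizes.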
