Information Complexity Criterion for Model Selection in Robust Regression Using A New Robust Penalty Term

12/04/2020
by   Esra Pamukçu, et al.

Model selection is the process of finding the best model from a set of candidate models in which the explanatory variables affect the response variable. A model selection criterion consists of two parts: the log-likelihood function as the lack-of-fit term and a specified penalty term. In this paper, we derive a new tool for model selection in robust regression. We introduce a new definition of relative entropy based on objective functions. For analytical simplicity, we use Huber's objective function ρ_H and propose a specified penalty term C_0^ρ_H to derive a new information complexity criterion (RICOMP_C_0^ρ_H) as a robust model selection tool. Additionally, using the properties of C_0^ρ_H, we propose a new value of the tuning parameter, called k_C_0, for Huber's ρ_H. When the normal distribution is contaminated, RICOMP_C_0^ρ_H chooses the true model better than its rivals. Monte Carlo simulation studies are carried out to show the utility of both k_C_0 and RICOMP_C_0^ρ_H. A real-data example is also given.
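As background for readers unfamiliar with Huber's ρ_H, the sketch below shows the standard Huber objective function: quadratic for small residuals and linear beyond a tuning constant k. The default k = 1.345 used here is the conventional choice giving roughly 95% efficiency under normality; it is not the paper's proposed k_C_0, whose derivation requires the full text.

```python
import numpy as np

def huber_rho(u, k=1.345):
    """Huber's objective function rho_H(u).

    Quadratic (u^2 / 2) when |u| <= k, linear (k|u| - k^2 / 2)
    otherwise, so large residuals are down-weighted relative
    to ordinary least squares.
    """
    u = np.asarray(u, dtype=float)
    return np.where(np.abs(u) <= k,
                    0.5 * u**2,
                    k * np.abs(u) - 0.5 * k**2)

# Small residuals are penalized quadratically, large ones linearly:
print(huber_rho(1.0, k=1.345))  # quadratic region: 0.5
print(huber_rho(3.0, k=1.0))    # linear region: 3*1 - 0.5 = 2.5
```

The piecewise form is what makes Huber-type estimators robust: outliers contribute only linearly to the loss, bounding their influence on the fit.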
