Restricted distance-type Gaussian estimators based on density power divergence and their applications in hypothesis testing

01/31/2023
by   Ángel Felipe, et al.

Zhang (2019) presented a general estimation approach based on the Gaussian distribution for parametric models in which the likelihood of the data is difficult to obtain or unknown, but the mean and variance-covariance matrix are known. Castilla and Zografos (2021) extended the method to density power divergence-based estimators, which are more robust than the likelihood-based Gaussian estimator against data contamination. In this paper we introduce the restricted minimum density power divergence Gaussian estimator (MDPDGE) and study its main asymptotic properties. We also examine its robustness through an influence function analysis. Restricted estimators are required in many practical situations, in particular when testing composite null hypotheses, and here they provide estimators constrained to satisfy the inherent restrictions of the underlying distribution. Further, we derive robust Rao-type test statistics based on the MDPDGE for testing simple null hypotheses, and we obtain explicit expressions for some important distributions. Finally, we empirically evaluate the efficiency and robustness of the method through a simulation study.
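To fix ideas, the sketch below illustrates the general (unrestricted) minimum density power divergence estimation idea with a Gaussian working density in the simplest univariate case, where the model supplies a mean mu and variance sigma^2. It is a minimal illustration only, not the paper's implementation: the objective is the standard density power divergence criterion evaluated at a N(mu, sigma^2) density, the tuning parameter alpha controls the robustness-efficiency trade-off (alpha -> 0 recovers the likelihood-based Gaussian estimator), and all function and variable names (dpd_gaussian_objective, mdpdge, alpha) are illustrative assumptions. A restricted version would additionally impose the null-hypothesis constraints on the parameters, e.g. via constrained optimization.

# Minimal sketch of a minimum density power divergence estimator with a
# Gaussian working density (univariate case). Names are illustrative,
# not taken from the paper.
import numpy as np
from scipy.optimize import minimize

def dpd_gaussian_objective(theta, x, alpha):
    """Density power divergence objective for a N(mu, sigma^2) density, alpha > 0."""
    mu, log_sigma = theta
    sigma2 = np.exp(2.0 * log_sigma)              # sigma parametrized on the log scale
    const = (2.0 * np.pi * sigma2) ** (-alpha / 2.0)
    # integral of f_theta(x)^(1+alpha) over x, available in closed form for the Gaussian
    integral_term = const / np.sqrt(1.0 + alpha)
    # empirical term: (1 + 1/alpha) times the sample mean of f_theta(x_i)^alpha
    dens_alpha = const * np.exp(-alpha * (x - mu) ** 2 / (2.0 * sigma2))
    return integral_term - (1.0 + 1.0 / alpha) * dens_alpha.mean()

def mdpdge(x, alpha=0.3):
    """Minimize the DPD objective; larger alpha gives more robustness, less efficiency."""
    theta0 = np.array([np.median(x), np.log(np.std(x))])   # robust starting values
    res = minimize(dpd_gaussian_objective, theta0, args=(x, alpha),
                   method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = rng.normal(0.0, 1.0, size=200)
    contaminated = np.concatenate([clean, rng.normal(8.0, 1.0, size=10)])
    # The estimates remain close to (0, 1) because outliers are downweighted.
    print(mdpdge(contaminated, alpha=0.5))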
