Tuning in ridge logistic regression to solve separation
Separation is a common problem in logistic regression that causes the iterative estimation process for the maximum likelihood estimates to fail. Firth's correction (FC) was proposed as a solution, providing finite estimates even in the presence of separation. In this paper we evaluate whether ridge regression (RR) could be considered instead, specifically, whether it can reduce the mean squared error (MSE) of the coefficient estimates compared to FC. In RR, the tuning parameter that determines the penalty strength is usually obtained by minimizing some measure of out-of-sample prediction error or an information criterion. However, in the presence of separation, tuning by these measures can yield an optimized tuning parameter of zero (no shrinkage), and hence cannot provide a universal solution. We derive a new bootstrap-based tuning criterion B that always leads to shrinkage. Moreover, we demonstrate how valid inference can be obtained by combining resampled profile penalized likelihood functions. Our approach is illustrated with an example from oncology, and its performance is compared to FC in a simulation study. Our simulations show that in analyses of small and sparse datasets with many correlated covariates, B-tuned RR can yield coefficient estimates with smaller MSE than FC and confidence intervals that approximately achieve nominal coverage probabilities.
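The paper's criterion B and its profile-penalized-likelihood inference are not reproduced here. As a minimal sketch of the underlying ideas, the following Python code (using scikit-learn's LogisticRegression, where C is the inverse of the ridge penalty strength) first shows how separation makes the near-unpenalized slope blow up while a ridge penalty keeps it finite, and then runs a generic bootstrap tuning loop in a similar spirit; the function name bootstrap_tune, the candidate grid, and the out-of-bag deviance criterion are illustrative assumptions, not the paper's method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

# Perfectly separated toy data: y == 1 exactly when x > 0, so the
# unpenalized maximum likelihood slope diverges and the usual
# iterative fitting cannot converge to a finite value.
x = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]).reshape(-1, 1)
y = np.array([0, 0, 0, 1, 1, 1])

# Near-unpenalized fit (very large C = almost no ridge penalty):
# the fitted slope becomes huge, a symptom of separation.
near_mle = LogisticRegression(C=1e10, max_iter=10_000).fit(x, y)
print("near-MLE slope:", near_mle.coef_[0, 0])

# Ridge-penalized fit: the L2 penalty keeps the slope finite.
ridge = LogisticRegression(C=1.0).fit(x, y)
print("ridge slope:", ridge.coef_[0, 0])


def bootstrap_tune(X, y, candidate_Cs, n_boot=200, seed=0):
    """Pick the candidate C minimizing the mean out-of-bag deviance
    over bootstrap resamples (an illustrative stand-in for
    bootstrap-based tuning, not the paper's criterion B)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    scores = {C: [] for C in candidate_Cs}
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)        # bootstrap resample
        oob = np.setdiff1d(np.arange(n), idx)   # out-of-bag indices
        # Skip resamples with no out-of-bag data or only one class.
        if oob.size == 0 or len(np.unique(y[idx])) < 2:
            continue
        for C in candidate_Cs:
            model = LogisticRegression(C=C, max_iter=10_000).fit(X[idx], y[idx])
            p = model.predict_proba(X[oob])[:, 1]
            scores[C].append(log_loss(y[oob], p, labels=[0, 1]))
    return min(scores, key=lambda C: np.mean(scores[C]))


print("tuned C:", bootstrap_tune(x, y, candidate_Cs=[0.01, 0.1, 1.0, 10.0]))
```

Because the score is averaged over resampled fits rather than computed once on the full separated dataset, a strictly positive penalty (finite C) tends to be selected, mirroring the motivation for a tuning criterion that always leads to shrinkage.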