A regularization approach for stable estimation of loss development factors

04/17/2020
by Himchan Jeong, et al.

In this article, we show that a new penalty function, which we call the log-adjusted absolute deviation (LAAD), emerges when the Bayesian LASSO is theoretically extended with conjugate hyperprior distributional assumptions. We further show that the estimator with the LAAD penalty has a closed form in the case of a single covariate and can be extended to the general case via a coordinate descent algorithm, with guaranteed convergence under mild conditions. This approach has the advantage of avoiding unnecessary model bias while still allowing variable selection, which is linked to the choice of the tail factor in loss development for claims reserving. We calibrate the proposed model on a multi-line insurance dataset from a property and casualty company, in which reported aggregate losses are observed across accident years and development periods.
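
To make the abstract's coordinate descent idea concrete, the sketch below shows how a penalized least-squares fit with a separable log-type penalty might be implemented. It is only an illustration under stated assumptions: the penalty form lambda * log(1 + |beta_j|), the numerical scalar minimization, and the demo bounds are placeholders and are not claimed to be the paper's exact LAAD penalty or its closed-form thresholding rule.

```python
# Minimal sketch: coordinate descent for least squares with a separable
# log-type penalty. The penalty lambda * log(1 + |b|) is an illustrative
# assumption; the per-coordinate problem is solved numerically instead of
# with the paper's closed-form update.
import numpy as np
from scipy.optimize import minimize_scalar


def coordinate_descent(X, y, lam=1.0, n_iter=100, tol=1e-8):
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        beta_old = beta.copy()
        for j in range(p):
            # Partial residual with coordinate j removed.
            r_j = y - X @ beta + X[:, j] * beta[j]

            # One-dimensional penalized least-squares objective in beta_j.
            def obj(b):
                return 0.5 * np.sum((r_j - X[:, j] * b) ** 2) + lam * np.log1p(abs(b))

            # Bounds are chosen for this demo only.
            beta[j] = minimize_scalar(obj, bounds=(-10.0, 10.0), method="bounded").x
            # Snap tiny estimates to zero to mimic variable selection.
            if abs(beta[j]) < 1e-6:
                beta[j] = 0.0
        if np.max(np.abs(beta - beta_old)) < tol:
            break
    return beta


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    true_beta = np.array([1.5, 0.0, -2.0, 0.0, 0.5])
    y = X @ true_beta + rng.normal(scale=0.5, size=200)
    print(coordinate_descent(X, y, lam=5.0))
```

In this toy run, coefficients generated as zero are typically shrunk to exactly zero while the others remain close to their true values, mirroring the variable-selection behavior the abstract attributes to the LAAD penalty.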
