Balancing Out Regression Error: Efficient Treatment Effect Estimation without Smooth Propensities

11/30/2017
by David A. Hirshberg, et al.

There has been a recent surge of interest in doubly robust approaches to treatment effect estimation in observational studies, driven by a realization that they can be combined with modern machine learning methods to obtain estimators that pair good finite sample performance with asymptotic efficiency. These methods first fit a regularized regression model to the observed outcomes, and then use a weighted sum of residuals to debias it. Typically the debiasing weights are obtained by inverting a carefully tuned estimate of the propensity scores, and this choice can be justified by asymptotic arguments. However, there is no good reason to believe that an optimally tuned propensity model would also yield optimally tuned debiasing weights in finite samples. In this paper, we study an alternative approach to efficient treatment effect estimation based on using weights that directly optimize worst-case risk bounds; concretely, this amounts to selecting weights that uniformly balance out a class of functions known to capture the errors of the outcome regression with high probability. We provide general conditions under which our method achieves the semiparametric efficiency bound; in particular, unlike existing methods, we do not assume any regularity on the treatment propensities beyond overlap. In extensive experiments, we find that our method, weighting for uniform balance, compares favorably to augmented inverse-propensity weighting and targeted maximum likelihood estimation.
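The abstract describes a two-stage estimator: a regularized outcome regression followed by a weighted residual correction, where the weights come either from an inverted propensity estimate (the usual AIPW construction) or, as proposed here, from directly optimizing a balance criterion. The sketch below illustrates that contrast on synthetic data; it is not the authors' implementation, and the data-generating process, ridge outcome model, linear balancing basis, and simple ridge-penalized balancing solver are all illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a plug-in regression
# estimate of the average treatment effect, debiased by a weighted sum of
# residuals, with weights from (a) an inverted propensity fit (AIPW) and
# (b) a simple ridge-penalized balancing problem standing in for the
# paper's worst-case-bias optimization.
import numpy as np
from sklearn.linear_model import Ridge, LogisticRegression

rng = np.random.default_rng(0)
n, p = 2000, 5
X = rng.normal(size=(n, p))
e = 1.0 / (1.0 + np.exp(-X[:, 0]))          # true propensity score
W = rng.binomial(1, e)                      # treatment assignment
Y = X @ np.array([1.0, 0.5, 0, 0, 0]) + 1.0 * W + rng.normal(size=n)

# Step 1: regularized outcome regressions, fit separately by arm.
m1 = Ridge(alpha=1.0).fit(X[W == 1], Y[W == 1])
m0 = Ridge(alpha=1.0).fit(X[W == 0], Y[W == 0])
plug_in = np.mean(m1.predict(X) - m0.predict(X))
resid = Y - np.where(W == 1, m1.predict(X), m0.predict(X))

# Step 2a: debiasing weights obtained by inverting an estimated propensity.
e_hat = LogisticRegression().fit(X, W).predict_proba(X)[:, 1]
gamma_ipw = W / e_hat - (1 - W) / (1 - e_hat)
tau_aipw = plug_in + np.mean(gamma_ipw * resid)

# Step 2b: debiasing weights chosen to balance a basis of functions
# (here just an intercept and the raw covariates) against the full sample.
def balance_weights(X_sub, X_all, lam=1.0):
    """Weights g on a subsample so that (1/n) * phi_sub.T @ g is close to
    the full-sample basis means, with an L2 penalty on g."""
    n_all = X_all.shape[0]
    phi_sub = np.column_stack([np.ones(len(X_sub)), X_sub])
    phi_all = np.column_stack([np.ones(n_all), X_all])
    target = phi_all.mean(axis=0)
    k = phi_sub.shape[1]
    # closed form of the ridge-penalized least-squares balancing problem
    return n_all * phi_sub @ np.linalg.solve(
        phi_sub.T @ phi_sub + lam * np.eye(k), target
    )

gamma_bal = np.zeros(n)
gamma_bal[W == 1] = balance_weights(X[W == 1], X)
gamma_bal[W == 0] = -balance_weights(X[W == 0], X)
tau_balance = plug_in + np.mean(gamma_bal * resid)

print(f"plug-in estimate:          {plug_in:.3f}")
print(f"AIPW estimate:             {tau_aipw:.3f}")
print(f"balancing-weight estimate: {tau_balance:.3f}   (true effect = 1.0)")
```

In the AIPW variant the quality of the correction hinges on how well the fitted propensity approximates the true one, whereas the balancing variant only asks the weights to cancel error components spanned by the chosen basis, which is the finite-sample motivation the abstract highlights.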
