Distributionally Robust Survival Analysis: A Novel Fairness Loss Without Demographics

11/18/2022
by Shu Hu, et al.

We propose a general approach for training survival analysis models that minimizes a worst-case error across all subpopulations that are large enough (occurring with at least a user-specified minimum probability). The training loss requires no demographic information and does not designate any attribute as sensitive. Despite this, we demonstrate that our approach often scores better on recently established fairness metrics (without a significant drop in prediction accuracy) compared to various baselines, including ones that directly use sensitive demographic information in their training loss. Our code is available at: https://github.com/discovershu/DRO_COX
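A worst-case error over all subpopulations of probability mass at least alpha is commonly formalized as a conditional value-at-risk (CVaR) of the per-sample losses. The sketch below illustrates that formulation in NumPy; it is an assumption about the general DRO objective, not the paper's exact DRO-COX training loss, and the function name and interface are illustrative.

```python
import numpy as np

def cvar_loss(per_sample_losses, alpha):
    """CVaR at level alpha: the average loss over the worst-off
    alpha-fraction of samples, i.e. the worst-case mean loss over
    any subpopulation occurring with probability at least alpha.

    Uses the closed-form minimizer of
        min_eta  eta + E[max(loss - eta, 0)] / alpha,
    where the optimal eta is the (1 - alpha) quantile of the losses.
    """
    losses = np.asarray(per_sample_losses, dtype=float)
    # Threshold separating the worst alpha-fraction of the losses.
    eta = np.quantile(losses, 1.0 - alpha)
    return eta + np.mean(np.maximum(losses - eta, 0.0)) / alpha
```

For example, with losses `[1, 2, 3, 4]` and `alpha = 0.5`, the value is the mean over the worst half, `(3 + 4) / 2 = 3.5`; with `alpha = 1.0` it reduces to the ordinary mean. In training, the per-sample losses would come from a survival model (e.g. Cox partial-likelihood terms), and alpha is the user-specified minimum subpopulation probability.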
