Explaining black box decisions by Shapley cohort refinement

11/01/2019
by Masayoshi Mase, et al.

We introduce a variable importance measure to explain the importance of individual variables to a decision made by a black box function. Our measure is based on the Shapley value from cooperative game theory. Measures of variable importance usually work by changing the value of one or more variables with the others held fixed and then recomputing the function of interest. That approach is problematic because it can create very unrealistic combinations of predictors that never appear in practice or that were never present when the prediction function was being created. Our cohort refinement Shapley approach measures variable importance without using any data points that were not actually observed.
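To make the idea concrete, here is a minimal sketch of a cohort-refinement Shapley computation under stated assumptions: each coalition of variables is valued by the mean prediction over the cohort of *observed* rows that are similar to the target point on every variable in the coalition, so no synthetic predictor combinations are ever evaluated. The function names (`cohort_shapley`, `similar`) and the brute-force enumeration over all coalitions are illustrative choices, not the paper's implementation.

```python
import numpy as np
from itertools import combinations
from math import factorial

def cohort_shapley(X, preds, target_idx, similar):
    """Shapley importance of each variable for the prediction at X[target_idx].

    The value of a coalition S is the mean prediction over the cohort of
    observed rows that are 'similar' to the target on every variable in S,
    so only data points that actually occur are ever used.
    """
    n, d = X.shape
    t = X[target_idx]

    def value(S):
        mask = np.ones(n, dtype=bool)
        for j in S:
            mask &= similar(X[:, j], t[j])
        # The cohort always contains the target itself, so it is never empty.
        return preds[mask].mean()

    phi = np.zeros(d)
    for j in range(d):
        others = [k for k in range(d) if k != j]
        for r in range(d):
            for S in combinations(others, r):
                # Standard Shapley weight |S|! (d-|S|-1)! / d!
                w = factorial(r) * factorial(d - r - 1) / factorial(d)
                phi[j] += w * (value(S + (j,)) - value(S))
    return phi
```

A `similar` predicate such as `lambda col, v: col == v` treats discrete variables by exact match; a tolerance band like `lambda col, v: np.abs(col - v) <= tol` would serve for continuous ones. By the efficiency property of the Shapley value, the attributions sum to the mean prediction over the fully refined cohort minus the overall mean prediction.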
