DBCal: Density Based Calibration of classifier predictions for uncertainty quantification

04/01/2022
by Alex Hagen, et al.

Measurement of the uncertainty of predictions from machine learning methods is important across scientific domains and applications. We present, to our knowledge, the first such technique that quantifies the uncertainty of a classifier's predictions while accounting for both the classifier's belief and its performance. We prove that our method provides an accurate estimate of the probability that the outputs of two neural networks are correct, showing an expected calibration error of less than 0.2% for one network and less than 3% for the other. We empirically show that the uncertainty returned by our method is an accurate measurement of the probability that the classifier's prediction is correct and, therefore, has broad utility in uncertainty propagation.
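For reference, expected calibration error (ECE), the metric cited above, is the weighted average gap between predicted confidence and observed accuracy over confidence bins. The sketch below computes the standard ECE for a generic classifier; the function name, binning scheme, and toy data are illustrative and are not part of DBCal itself.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Standard ECE: weighted average |accuracy - confidence| over
    equal-width confidence bins (binning choice is an assumption)."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    n = len(confidences)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if not in_bin.any():
            continue
        avg_conf = confidences[in_bin].mean()   # mean predicted confidence in bin
        accuracy = correct[in_bin].mean()       # empirical accuracy in bin
        ece += (in_bin.sum() / n) * abs(avg_conf - accuracy)
    return ece

# Toy usage: per-prediction confidences and whether each prediction was correct.
conf = [0.95, 0.80, 0.60, 0.99, 0.70]
hit = [1, 1, 0, 1, 0]
print(expected_calibration_error(conf, hit))
```

An ECE near zero means the classifier's reported confidences match its observed accuracy, which is the sense in which the abstract's calibration numbers should be read.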

