A New Loss Function for Temperature Scaling to have Better Calibrated Deep Networks

10/27/2018
by Azadeh Sadat Mozafari, et al.

Although deep neural networks (DNNs) have recently achieved impressive results on a variety of tasks, they suffer from poor uncertainty prediction. Temperature Scaling (TS) is an efficient post-processing method for calibrating DNNs toward more accurate uncertainty estimates. TS relies on a single parameter T that softens the logit layer of a DNN; its optimal value is found by minimizing the Negative Log-Likelihood (NLL) loss function. In this paper, we discuss the weakness of the NLL loss function, especially for DNNs with high accuracy, and propose a new loss function called Attended-NLL, which can significantly improve the calibration ability of TS.
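To make the baseline concrete, below is a minimal sketch of standard Temperature Scaling as described above: a single scalar T is fit on held-out validation logits by minimizing NLL. This illustrates only the standard TS procedure the paper builds on, not the proposed Attended-NLL loss; it assumes PyTorch, and the function name `fit_temperature` and the random stand-in data are illustrative.

```python
import torch
import torch.nn as nn
import torch.optim as optim

def fit_temperature(logits, labels, max_iter=50):
    """Learn a single scalar temperature T by minimizing NLL
    on held-out validation logits (standard Temperature Scaling)."""
    T = nn.Parameter(torch.ones(1))        # T = 1 means no scaling
    nll = nn.CrossEntropyLoss()            # cross-entropy on logits == NLL
    optimizer = optim.LBFGS([T], lr=0.01, max_iter=max_iter)

    def closure():
        optimizer.zero_grad()
        loss = nll(logits / T, labels)     # soften the logits by dividing by T
        loss.backward()
        return loss

    optimizer.step(closure)
    return T.item()

if __name__ == "__main__":
    # Stand-in "validation" logits and labels for illustration only
    torch.manual_seed(0)
    val_logits = torch.randn(1000, 10) * 5  # high-magnitude (overconfident) logits
    val_labels = torch.randint(0, 10, (1000,))
    T = fit_temperature(val_logits, val_labels)
    print(f"learned temperature: {T:.3f}")
```

Note that T is fit after training, on a validation set, and dividing the logits by T changes only the confidence of the softmax outputs, not the predicted class, so accuracy is unaffected.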
