MetaDetect: Uncertainty Quantification and Prediction Quality Estimates for Object Detection

10/04/2020 ∙ by Marius Schubert, et al.

In object detection with deep neural networks, the box-wise objectness score tends to be overconfident, sometimes even indicating high confidence in the presence of inaccurate predictions. Hence, the reliability of the prediction, and therefore reliable uncertainty estimates, are of the highest interest. In this work, we present a post-processing method that provides predictive uncertainty estimates and quality estimates for any given neural network. These estimates are learned by a post-processing model that receives as input a hand-crafted set of transparent metrics in the form of a structured dataset. From these metrics, we learn two tasks for predicted bounding boxes: we discriminate between true positives (IoU ≥ 0.5) and false positives (IoU < 0.5), which we term meta classification, and we predict IoU values directly, which we term meta regression. The probabilities produced by the meta classification model aim at learning the probabilities of success and failure and therefore provide a modelled predictive uncertainty estimate, while meta regression yields a quality estimate. In numerical experiments, we use the publicly available YOLOv3 and Faster R-CNN networks and evaluate meta classification and meta regression performance on the KITTI, Pascal VOC and COCO datasets. We demonstrate that our metrics are indeed well correlated with the IoU. For meta classification, we obtain classification accuracies of up to 98.92%, and for meta regression, we obtain values of up to 91.78%; both tasks outperform the network's objectness score and other baseline approaches. Therefore, we obtain more reliable uncertainty and quality estimates, which is particularly interesting in the absence of ground truth.
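The two meta tasks described above can be sketched in a few lines: given a structured dataset with one row of hand-crafted metrics per predicted box and the corresponding IoU with the best-matching ground-truth box, meta classification fits a classifier on the TP/FP label (IoU ≥ 0.5), and meta regression fits a regressor on the IoU itself. The snippet below is a minimal illustration with synthetic stand-in features (objectness score, box geometry, number of candidate boxes); it is not the paper's exact metric set or model choice.

```python
# Minimal sketch of meta classification and meta regression on a structured
# dataset of per-box metrics. Feature names and models are illustrative
# assumptions, not the paper's exact setup.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, r2_score

rng = np.random.default_rng(0)

# Toy stand-in for the structured dataset: one row of hand-crafted metrics per
# predicted bounding box, plus the IoU of that box with the best-matching
# ground-truth box (available only on the annotated data used to fit the meta model).
n_boxes = 5000
features = np.column_stack([
    rng.uniform(0.0, 1.0, n_boxes),   # objectness score of the box
    rng.uniform(10, 300, n_boxes),    # box width in pixels
    rng.uniform(10, 300, n_boxes),    # box height in pixels
    rng.integers(1, 20, n_boxes),     # number of candidate boxes before NMS
])
iou = np.clip(0.8 * features[:, 0] + 0.1 * rng.normal(size=n_boxes), 0.0, 1.0)

X_train, X_test, iou_train, iou_test = train_test_split(features, iou, random_state=0)

# Meta classification: true positive (IoU >= 0.5) vs. false positive (IoU < 0.5).
# The predicted probability serves as a learned predictive uncertainty estimate.
clf = LogisticRegression(max_iter=1000).fit(X_train, iou_train >= 0.5)
tp_prob = clf.predict_proba(X_test)[:, 1]
print("meta classification accuracy:",
      accuracy_score(iou_test >= 0.5, tp_prob >= 0.5))

# Meta regression: predict the IoU directly as a quality estimate.
reg = GradientBoostingRegressor(random_state=0).fit(X_train, iou_train)
iou_hat = reg.predict(X_test)
print("meta regression R^2:", r2_score(iou_test, iou_hat))
```

At inference time, ground truth is no longer needed: the fitted meta models map the hand-crafted metrics of each new prediction to a TP probability and an estimated IoU.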
