Can We Trust You? On Calibration of a Probabilistic Object Detector for Autonomous Driving

D Feng, L Rosenbaum, C Glaeser, F Timm… - arXiv preprint arXiv …, 2019 - arxiv.org
Reliable uncertainty estimation is crucial for perception systems in safe autonomous driving. Recently, many methods have been proposed to model uncertainties in deep learning based object detectors. However, the estimated probabilities are often uncalibrated, which may lead to severe problems in safety critical scenarios. In this work, we identify such uncertainty miscalibration problems in a probabilistic LiDAR 3D object detection network, and propose three practical methods to significantly reduce errors in uncertainty calibration. Extensive experiments on several datasets show that our methods produce well-calibrated uncertainties, and generalize well between different datasets.
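The abstract does not define how calibration error is measured. As illustration only (not code from the paper), here is a minimal sketch of expected calibration error (ECE), a standard metric for the miscalibration the authors describe: detections are binned by predicted confidence, and per-bin gaps between mean confidence and empirical accuracy are averaged, weighted by bin size. Binary per-detection correctness labels are assumed.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """ECE: bin detections by confidence, then average the per-bin
    |empirical accuracy - mean confidence| gaps, weighted by bin mass."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        acc = correct[mask].mean()       # fraction of correct detections in bin
        conf = confidences[mask].mean()  # average predicted confidence in bin
        ece += mask.mean() * abs(acc - conf)
    return ece

# Well-calibrated: 80% confidence, 80% of detections actually correct -> ECE 0.0
conf = np.full(1000, 0.8)
corr = np.array([1] * 800 + [0] * 200)
print(expected_calibration_error(conf, corr))

# Overconfident: 90% confidence but only 60% correct -> ECE 0.3
print(expected_calibration_error(np.full(1000, 0.9),
                                 np.array([1] * 600 + [0] * 400)))
```

A calibrated detector drives ECE toward zero; overconfident predictions, the failure mode flagged in the abstract, inflate it.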