Calibration of Model Uncertainty for Dropout Variational Inference

Publication: Working paper/Preprint › Preprint

Authors

  • Max-Heinrich Laves
  • Sontje Ihler
  • Karl-Philipp Kortmann
  • Tobias Ortmaier

Details

Original language: English
Publication status: Published electronically (E-pub) - 20 Jun 2020

Abstract

The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration. The effectiveness of recalibration is evaluated on CIFAR-10/100 and SVHN for recent CNN architectures. Experimental results show that logit scaling considerably reduces miscalibration in terms of UCE. Well-calibrated uncertainty enables reliable rejection of uncertain predictions and robust detection of out-of-distribution data.
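As a rough illustration of the approach summarized above, the sketch below combines Monte Carlo dropout sampling with temperature-scaled logits (logit scaling) and an ECE-style binned estimate of the expected uncertainty calibration error. It is a minimal PyTorch sketch assuming normalized predictive entropy as the uncertainty measure; the function names (mc_dropout_predict, uce), the sample count, and the binning scheme are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn.functional as F

def mc_dropout_predict(model, x, T=1.0, n_samples=25):
    # Monte Carlo dropout: keep dropout active at test time and average
    # the softmax outputs of n_samples stochastic forward passes.
    # Logits are divided by a temperature T (logit scaling) before softmax.
    model.train()  # enables dropout; assumes the model has no batch-norm layers
    with torch.no_grad():
        probs = torch.stack([F.softmax(model(x) / T, dim=-1)
                             for _ in range(n_samples)])
    return probs.mean(dim=0)  # approximate predictive distribution p(y|x)

def uce(probs, labels, n_bins=10):
    # Expected uncertainty calibration error: bin samples by normalized
    # predictive entropy, then compare each bin's error rate with its mean
    # uncertainty, weighted by the fraction of samples in the bin.
    n_classes = probs.shape[-1]
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    uncert = entropy / torch.log(torch.tensor(float(n_classes)))  # scale to [0, 1]
    errors = (probs.argmax(dim=-1) != labels).float()
    edges = torch.linspace(0, 1, n_bins + 1)
    total = torch.tensor(0.0)
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_bin = (uncert > lo) & (uncert <= hi)
        if in_bin.any():
            total += in_bin.float().mean() * (errors[in_bin].mean() - uncert[in_bin].mean()).abs()
    return total.item()

In this setting, the temperature T would typically be fitted on a held-out validation set, e.g. by minimizing the negative log-likelihood of the mean predictive distribution, before UCE is evaluated on the test set.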

Cite this

Calibration of Model Uncertainty for Dropout Variational Inference. / Laves, Max-Heinrich; Ihler, Sontje; Kortmann, Karl-Philipp et al.
2020.

Publication: Working paper/Preprint › Preprint

Laves, M.-H., Ihler, S., Kortmann, K.-P., & Ortmaier, T. (2020). Calibration of Model Uncertainty for Dropout Variational Inference. Advance online publication. https://arxiv.org/abs/2006.11584
Laves MH, Ihler S, Kortmann KP, Ortmaier T. Calibration of Model Uncertainty for Dropout Variational Inference. 2020 Jun 20. Epub 2020 Jun 20.
Laves, Max-Heinrich ; Ihler, Sontje ; Kortmann, Karl-Philipp et al. / Calibration of Model Uncertainty for Dropout Variational Inference. 2020.
BibTeX
@techreport{195a1500915245dca81bc174b40040b6,
title = "Calibration of Model Uncertainty for Dropout Variational Inference",
abstract = "The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration. The effectiveness of recalibration is evaluated on CIFAR-10/100 and SVHN for recent CNN architectures. Experimental results show that logit scaling considerably reduces miscalibration in terms of UCE. Well-calibrated uncertainty enables reliable rejection of uncertain predictions and robust detection of out-of-distribution data.",
keywords = "cs.LG, stat.ML",
author = "Max-Heinrich Laves and Sontje Ihler and Karl-Philipp Kortmann and Tobias Ortmaier",
year = "2020",
month = jun,
day = "20",
language = "English",
type = "WorkingPaper",
url = "https://arxiv.org/abs/2006.11584",
}

RIS

TY - UNPB

T1 - Calibration of Model Uncertainty for Dropout Variational Inference

AU - Laves, Max-Heinrich

AU - Ihler, Sontje

AU - Kortmann, Karl-Philipp

AU - Ortmaier, Tobias

PY - 2020/6/20

Y1 - 2020/6/20

N2 - The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration. The effectiveness of recalibration is evaluated on CIFAR-10/100 and SVHN for recent CNN architectures. Experimental results show that logit scaling considerably reduces miscalibration in terms of UCE. Well-calibrated uncertainty enables reliable rejection of uncertain predictions and robust detection of out-of-distribution data.

AB - The model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration. In this paper, different logit scaling methods are extended to dropout variational inference to recalibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration. The effectiveness of recalibration is evaluated on CIFAR-10/100 and SVHN for recent CNN architectures. Experimental results show that logit scaling considerably reduces miscalibration in terms of UCE. Well-calibrated uncertainty enables reliable rejection of uncertain predictions and robust detection of out-of-distribution data.

KW - cs.LG

KW - stat.ML

M3 - Preprint

BT - Calibration of Model Uncertainty for Dropout Variational Inference

ER -
