Well-calibrated Model Uncertainty with Temperature Scaling for Dropout Variational Inference

Research output: Working paper/Preprint › Preprint

Authors

  • Max-Heinrich Laves
  • Sontje Ihler
  • Karl-Philipp Kortmann
  • Tobias Ortmaier


Details

Original language: English
Number of pages: 8
Publication status: E-pub ahead of print - 2019

Abstract

Model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration: the uncertainty does not represent the model error well. In this paper, temperature scaling is extended to dropout variational inference to calibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration of uncertainty. The effectiveness of this approach is evaluated on CIFAR-10/100 for recent CNN architectures. Experimental results show that temperature scaling considerably reduces miscalibration in terms of UCE and enables robust rejection of uncertain predictions. The proposed approach can easily be derived from frequentist temperature scaling and yields well-calibrated model uncertainty. It is simple to implement and does not affect the model accuracy.
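The approach described in the abstract can be sketched roughly as follows: temperature-scaled softmax outputs are averaged over stochastic (dropout) forward passes, an uncertainty value is derived per prediction, and UCE compares binned uncertainty against binned error. This is a minimal NumPy illustration, not the authors' implementation; the bin count, the dropout simulation, and the use of normalized predictive entropy as the uncertainty measure are assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, T=1.0):
    # Temperature-scaled softmax: divide logits by a scalar T > 0 before normalising.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_predict(logit_fn, x, T=1.0, n_samples=25):
    # Average temperature-scaled softmax outputs over stochastic forward passes.
    probs = np.stack([softmax(logit_fn(x), T) for _ in range(n_samples)])
    return probs.mean(axis=0)

def normalized_entropy(p):
    # Predictive entropy scaled to [0, 1], used here as the uncertainty measure.
    K = p.shape[-1]
    return -(p * np.log(p + 1e-12)).sum(axis=-1) / np.log(K)

def uce(errors, uncertainties, n_bins=10):
    # Expected uncertainty calibration error: bin predictions by uncertainty and
    # accumulate the weighted gap between mean error and mean uncertainty per bin.
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        mask = (uncertainties > lo) & (uncertainties <= hi)
        if mask.any():
            total += mask.mean() * abs(errors[mask].mean() - uncertainties[mask].mean())
    return total

# Toy demo: a "model" whose logits are perturbed by dropout-like multiplicative noise.
logit_fn = lambda x: x * rng.binomial(1, 0.8, size=x.shape) / 0.8
p = mc_dropout_predict(logit_fn, np.array([2.0, 1.0, 0.0]), T=1.5, n_samples=50)
```

In the paper's setting, the temperature T would be fitted on a held-out validation set before computing the MC-dropout predictive distribution; the optimization step is omitted in this sketch.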

Keywords

    cs.LG, stat.ML

Cite this

Well-calibrated Model Uncertainty with Temperature Scaling for Dropout Variational Inference. / Laves, Max-Heinrich; Ihler, Sontje; Kortmann, Karl-Philipp et al. 2019.


Laves MH, Ihler S, Kortmann KP, Ortmaier T. Well-calibrated Model Uncertainty with Temperature Scaling for Dropout Variational Inference. 2019. Epub 2019. doi: 10.48550/arXiv.1909.13550
@techreport{a4b95111ba3643148d1811cc95ca699e,
title = "Well-calibrated Model Uncertainty with Temperature Scaling for Dropout Variational Inference",
abstract = "Model uncertainty obtained by variational Bayesian inference with Monte Carlo dropout is prone to miscalibration: the uncertainty does not represent the model error well. In this paper, temperature scaling is extended to dropout variational inference to calibrate model uncertainty. Expected uncertainty calibration error (UCE) is presented as a metric to measure miscalibration of uncertainty. The effectiveness of this approach is evaluated on CIFAR-10/100 for recent CNN architectures. Experimental results show that temperature scaling considerably reduces miscalibration in terms of UCE and enables robust rejection of uncertain predictions. The proposed approach can easily be derived from frequentist temperature scaling and yields well-calibrated model uncertainty. It is simple to implement and does not affect the model accuracy.",
keywords = "cs.LG, stat.ML",
author = "Max-Heinrich Laves and Sontje Ihler and Karl-Philipp Kortmann and Tobias Ortmaier",
note = "Accepted at 4th workshop on Bayesian Deep Learning (NeurIPS 2019)",
year = "2019",
doi = "10.48550/arXiv.1909.13550",
language = "English",
type = "WorkingPaper",

}

