Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging

Research output: Contribution to journal › Article › Research › peer review

Authors

  • Max-Heinrich Laves
  • Sontje Ihler
  • Jacob F. Fast
  • Lüder A. Kahrs
  • Tobias Ortmaier

Research Organisations

External Research Organisations

  • Hamburg University of Technology (TUHH)
  • Hannover Medical School (MHH)
  • University of Toronto

Details

Original language: English
Pages (from-to): 1-26
Number of pages: 26
Journal: The Journal of Machine Learning for Biomedical Imaging (MELBA)
Volume: MIDL 2020
Issue number: 1
Publication status: Published - 28 Apr 2021

Abstract

The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply σ scaling with a single scalar value, a simple yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, σ scaling reliably recalibrates predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
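
To make the recalibration step concrete, the following is a minimal sketch of σ scaling as described in the abstract: a single scalar s is fitted on held-out predictions by minimizing the Gaussian negative log-likelihood, which has a closed-form solution. The tensor names (mu_val, sigma_val, y_val) are illustrative assumptions rather than the authors' API; the reference implementation is available in the linked repository.

import torch

def sigma_scaling_factor(mu, sigma, y):
    """Fit the sigma-scaling factor s on a held-out (validation) set.

    mu    -- predicted means
    sigma -- predicted standard deviations (aleatoric, epistemic, or both)
    y     -- ground-truth targets
    Minimizing the Gaussian NLL of N(y | mu, (s * sigma)^2) with respect to
    s gives the closed form s^2 = mean((y - mu)^2 / sigma^2).
    """
    s_squared = torch.mean((y - mu) ** 2 / sigma ** 2)
    return torch.sqrt(s_squared)

# Hypothetical usage: fit s on validation predictions, then rescale test-time
# uncertainty estimates before rejecting unreliable predictions or screening
# for out-of-distribution samples.
# s = sigma_scaling_factor(mu_val, sigma_val, y_val)
# sigma_test_calibrated = s * sigma_test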

Keywords

    eess.IV, cs.CV

Cite this

Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. / Laves, Max-Heinrich; Ihler, Sontje; Fast, Jacob F. et al.
In: The Journal of Machine Learning for Biomedical Imaging (MELBA), Vol. MIDL 2020, No. 1, 28.04.2021, p. 1-26.


Laves, M-H, Ihler, S, Fast, JF, Kahrs, LA & Ortmaier, T 2021, 'Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging', The Journal of Machine Learning for Biomedical Imaging (MELBA), vol. MIDL 2020, no. 1, pp. 1-26. <https://arxiv.org/abs/2104.12376>
Laves, M.-H., Ihler, S., Fast, J. F., Kahrs, L. A., & Ortmaier, T. (2021). Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. The Journal of Machine Learning for Biomedical Imaging (MELBA), MIDL 2020(1), 1-26. https://arxiv.org/abs/2104.12376
Laves MH, Ihler S, Fast JF, Kahrs LA, Ortmaier T. Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. The Journal of Machine Learning for Biomedical Imaging (MELBA). 2021 Apr 28;MIDL 2020(1):1-26.
Laves, Max-Heinrich ; Ihler, Sontje ; Fast, Jacob F. et al. / Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. In: The Journal of Machine Learning for Biomedical Imaging (MELBA). 2021 ; Vol. MIDL 2020, No. 1. pp. 1-26.
BibTeX
@article{178f40b9868b4839805d6fbd16341345,
title = "Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging",
abstract = "The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty ",
keywords = "eess.IV, cs.CV",
author = "Max-Heinrich Laves and Sontje Ihler and Fast, {Jacob F.} and Kahrs, {L{\"u}der A.} and Tobias Ortmaier",
year = "2021",
month = apr,
day = "28",
language = "English",
volume = "MIDL 2020",
pages = "1--26",
number = "1",
journal = "The Journal of Machine Learning for Biomedical Imaging (MELBA)",

}

RIS

TY - JOUR

T1 - Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging

AU - Laves, Max-Heinrich

AU - Ihler, Sontje

AU - Fast, Jacob F.

AU - Kahrs, Lüder A.

AU - Ortmaier, Tobias

PY - 2021/4/28

Y1 - 2021/4/28

N2 - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty

AB - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty

KW - eess.IV

KW - cs.CV

M3 - Article

VL - MIDL 2020

SP - 1

EP - 26

JO - The Journal of Machine Learning for Biomedical Imaging (MELBA)

JF - The Journal of Machine Learning for Biomedical Imaging (MELBA)

SN - 2766-905X

IS - 1

ER -
