Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning

Research output: Contribution to journal › Conference article › Research › peer review

Authors

  • Max Heinrich Laves
  • Sontje Ihler
  • Jacob F. Fast
  • Lüder A. Kahrs
  • Tobias Ortmaier

Research Organisations

External Research Organisations

  • Centre for Image Guided Innovation and Therapeutic Intervention (CIGITI)
  • University of Toronto

Details

Original language: English
Pages (from-to): 393-412
Number of pages: 20
Journal: Proceedings of Machine Learning Research
Volume: 121
Publication status: Published - 2020
Event: 3rd Conference on Medical Imaging with Deep Learning, MIDL 2020 - Virtual, Online, Canada
Duration: 6 Jul 2020 – 8 Jul 2020

Abstract

The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We estimate predictive uncertainty for regression tasks by variational Bayesian inference with Monte Carlo dropout and show why predictive uncertainty is systematically underestimated. We suggest σ scaling with a single scalar value, a simple yet effective calibration method for both aleatoric and epistemic uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In all experiments, σ scaling reliably recalibrates predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at: github.com/mlaves/well-calibrated-regression-uncertainty.
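
As a minimal sketch of the σ-scaling idea (not the authors' reference implementation; the function name, tensor names, optimizer settings, and toy data below are assumptions), a single scalar s is fitted on a held-out calibration set so that s·σ minimizes the Gaussian negative log-likelihood of the network's predictions:

# Minimal sketch of sigma scaling (PyTorch); `mu`, `sigma`, `y` stand in for
# predicted means, predicted standard deviations, and ground-truth targets
# of a held-out calibration set (hypothetical names, not from the paper's code).
import torch

def fit_sigma_scale(mu, sigma, y, iters=200, lr=1e-2):
    # Optimize a single scalar s so that s * sigma is calibrated,
    # by minimizing the Gaussian negative log-likelihood on the calibration set.
    s = torch.ones(1, requires_grad=True)
    opt = torch.optim.Adam([s], lr=lr)
    for _ in range(iters):
        opt.zero_grad()
        scaled = s * sigma
        nll = (torch.log(scaled) + 0.5 * ((y - mu) / scaled) ** 2).mean()
        nll.backward()
        opt.step()
    return s.detach()

# Toy usage with dummy data: the model under-estimates sigma, so s > 1 corrects it.
mu = torch.randn(100)
sigma = torch.full((100,), 0.1)   # under-estimated predictive std
y = mu + 0.3 * torch.randn(100)   # actual noise level is larger
s = fit_sigma_scale(mu, sigma, y)
calibrated_sigma = s * sigma

The fitted scalar can then rescale the predictive standard deviation at test time, leaving the point predictions, and hence the accuracy, unchanged.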

Keywords

    Bayesian approximation, variational inference


Cite this

Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning. / Laves, Max Heinrich; Ihler, Sontje; Fast, Jacob F. et al.
In: Proceedings of Machine Learning Research, Vol. 121, 2020, p. 393-412.


Laves, MH, Ihler, S, Fast, JF, Kahrs, LA & Ortmaier, T 2020, 'Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning', Proceedings of Machine Learning Research, vol. 121, pp. 393-412. <https://proceedings.mlr.press/v121/laves20a.html>
Laves, M. H., Ihler, S., Fast, J. F., Kahrs, L. A., & Ortmaier, T. (2020). Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning. Proceedings of Machine Learning Research, 121, 393-412. https://proceedings.mlr.press/v121/laves20a.html
Laves MH, Ihler S, Fast JF, Kahrs LA, Ortmaier T. Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning. Proceedings of Machine Learning Research. 2020;121:393-412.
Laves, Max Heinrich ; Ihler, Sontje ; Fast, Jacob F. et al. / Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning. In: Proceedings of Machine Learning Research. 2020 ; Vol. 121. pp. 393-412.
BibTeX
@article{efa8002d06b24e8ca92971ef2cd8811c,
title = "Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning",
abstract = "The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of predictive uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show why predictive uncertainty is systematically underestimated. We suggest using σ scaling with a single scalar value; a simple, yet effective calibration method for both aleatoric and epistemic uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In all experiments, σ scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at: github.com/mlaves/well-calibrated-regression-uncertainty.",
keywords = "Bayesian approximation, variational inference",
author = "Laves, {Max Heinrich} and Sontje Ihler and Fast, {Jacob F.} and Kahrs, {L{\"u}der A.} and Tobias Ortmaier",
note = "Funding Information: We thank Vincent Modes for his insightful comments. This research has received funding from the European Union as being part of the ERDF OPhonLas project. ; 3rd Conference on Medical Imaging with Deep Learning, MIDL 2020 ; Conference date: 06-07-2020 Through 08-07-2020",
year = "2020",
language = "English",
volume = "121",
pages = "393--412",

}

RIS

TY - JOUR

T1 - Well-Calibrated Regression Uncertainty in Medical Imaging with Deep Learning

AU - Laves, Max Heinrich

AU - Ihler, Sontje

AU - Fast, Jacob F.

AU - Kahrs, Lüder A.

AU - Ortmaier, Tobias

N1 - Funding Information: We thank Vincent Modes for his insightful comments. This research has received funding from the European Union as part of the ERDF OPhonLas project.

PY - 2020

Y1 - 2020

N2 - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of predictive uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show why predictive uncertainty is systematically underestimated. We suggest using σ scaling with a single scalar value; a simple, yet effective calibration method for both aleatoric and epistemic uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In all experiments, σ scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at: github.com/mlaves/well-calibrated-regression-uncertainty.

AB - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of predictive uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show why predictive uncertainty is systematically underestimated. We suggest using σ scaling with a single scalar value; a simple, yet effective calibration method for both aleatoric and epistemic uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In all experiments, σ scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at: github.com/mlaves/well-calibrated-regression-uncertainty.

KW - Bayesian approximation, variational inference

UR - http://www.scopus.com/inward/record.url?scp=85093069704&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85093069704

VL - 121

SP - 393

EP - 412

JO - Proceedings of Machine Learning Research

JF - Proceedings of Machine Learning Research

T2 - 3rd Conference on Medical Imaging with Deep Learning, MIDL 2020

Y2 - 6 July 2020 through 8 July 2020

ER -
