Details

| | |
|---|---|
| Original language | English |
| Pages (from-to) | 1-26 |
| Number of pages | 26 |
| Journal | The Journal of Machine Learning for Biomedical Imaging (MELBA) |
| Volume | MIDL 2020 |
| Issue number | 1 |
| Publication status | Published - 28 Apr 2021 |
Abstract

The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value, a simple yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
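The abstract describes recalibrating an underestimated predictive standard deviation by multiplying it with a single scalar \( s \). A minimal sketch of that idea, assuming the scalar is fitted as the closed-form minimizer of the Gaussian negative log-likelihood on a held-out calibration set (the function name `sigma_scale` and the toy data are illustrative, not taken from the paper's code):

```python
import numpy as np

def sigma_scale(y, mu, sigma):
    """Fit a single scalar s so that s * sigma minimizes the Gaussian
    NLL of targets y under predictions N(mu, (s * sigma)^2).
    The closed-form solution is s = sqrt(mean(((y - mu) / sigma)^2))."""
    return np.sqrt(np.mean(((y - mu) / sigma) ** 2))

# Toy calibration set where the model underestimates uncertainty
# by a factor of 2: true noise std is 1.0, predicted std is 0.5.
rng = np.random.default_rng(0)
mu = np.zeros(10_000)              # predicted means
sigma = np.full(10_000, 0.5)       # predicted std devs (too small)
y = rng.normal(mu, 1.0)            # observed targets

s = sigma_scale(y, mu, sigma)
print(f"s = {s:.2f}")              # close to 2.0; recalibrated std = s * sigma
```

Because the fit is a single scalar applied after training, the point predictions (and hence accuracy) are untouched; only the reported uncertainty changes.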
Keywords
- eess.IV
- cs.CV
Cite this
- Standard
- Harvard
- APA
- Vancouver
- BibTeX
- RIS
Laves, M.-H., Ihler, S., Fast, J. F., Kahrs, L. A., & Ortmaier, T. (2021). Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging. In: The Journal of Machine Learning for Biomedical Imaging (MELBA), Vol. MIDL 2020, No. 1, 28.04.2021, p. 1-26.
Research output: Contribution to journal › Article › Research › peer review
TY - JOUR
T1 - Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging
AU - Laves, Max-Heinrich
AU - Ihler, Sontje
AU - Fast, Jacob F.
AU - Kahrs, Lüder A.
AU - Ortmaier, Tobias
PY - 2021/4/28
Y1 - 2021/4/28
N2 - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value, a simple yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
AB - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value, a simple yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
KW - eess.IV
KW - cs.CV
M3 - Article
VL - MIDL 2020
SP - 1
EP - 26
JO - The Journal of Machine Learning for Biomedical Imaging (MELBA)
JF - The Journal of Machine Learning for Biomedical Imaging (MELBA)
SN - 2766-905X
IS - 1
ER -