Details
Original language | English |
---|---|
Pages (from–to) | 1-26 |
Number of pages | 26 |
Journal | The Journal of Machine Learning for Biomedical Imaging (MELBA) |
Volume | MIDL 2020 |
Issue number | 1 |
Publication status | Published - 28 Apr 2021 |
Abstract

The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
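As a rough illustration of the \( \sigma \) scaling described in the abstract, the sketch below rescales a model's predicted standard deviations with a single scalar obtained by minimizing the Gaussian negative log-likelihood on a held-out calibration split, which admits a closed-form solution. This is a minimal NumPy sketch under those assumptions, not the authors' reference implementation (see the linked repository for that); the function and variable names are placeholders.

```python
import numpy as np

def sigma_scaling_factor(y_true, mu, sigma):
    """Single-scalar sigma scaling (illustrative sketch, not the reference code).

    With the network's predictions (mu_i, sigma_i) frozen, minimizing the
    Gaussian negative log-likelihood
        sum_i [ log(s * sigma_i) + (y_i - mu_i)^2 / (2 * s^2 * sigma_i^2) ]
    over the single scalar s has the closed-form solution
        s^2 = mean( (y_i - mu_i)^2 / sigma_i^2 ).
    """
    s_squared = np.mean((y_true - mu) ** 2 / sigma ** 2)
    return float(np.sqrt(s_squared))

# Hypothetical usage: fit s on a calibration split, then apply it at test time.
# s = sigma_scaling_factor(y_cal, mu_cal, sigma_cal)
# sigma_test_calibrated = s * sigma_test
```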
Cite
Laves, M-H, Ihler, S, Fast, JF, Kahrs, LA & Ortmaier, T 2021, 'Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging', in: The Journal of Machine Learning for Biomedical Imaging (MELBA), Vol. MIDL 2020, No. 1, 28.04.2021, pp. 1-26.
Publication: Contribution to journal › Article › Research › Peer review
TY - JOUR
T1 - Recalibration of Aleatoric and Epistemic Regression Uncertainty in Medical Imaging
AU - Laves, Max-Heinrich
AU - Ihler, Sontje
AU - Fast, Jacob F.
AU - Kahrs, Lüder A.
AU - Ortmaier, Tobias
PY - 2021/4/28
Y1 - 2021/4/28
N2 - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
AB - The consideration of predictive uncertainty in medical imaging with deep learning is of utmost importance. We apply estimation of both aleatoric and epistemic uncertainty by variational Bayesian inference with Monte Carlo dropout to regression tasks and show that predictive uncertainty is systematically underestimated. We apply \( \sigma \) scaling with a single scalar value; a simple, yet effective calibration method for both types of uncertainty. The performance of our approach is evaluated on a variety of common medical regression data sets using different state-of-the-art convolutional network architectures. In our experiments, \( \sigma \) scaling is able to reliably recalibrate predictive uncertainty. It is easy to implement and maintains the accuracy. Well-calibrated uncertainty in regression allows robust rejection of unreliable predictions or detection of out-of-distribution samples. Our source code is available at https://github.com/mlaves/well-calibrated-regression-uncertainty
KW - eess.IV
KW - cs.CV
M3 - Article
VL - MIDL 2020
SP - 1
EP - 26
JO - The Journal of Machine Learning for Biomedical Imaging (MELBA)
JF - The Journal of Machine Learning for Biomedical Imaging (MELBA)
SN - 2766-905X
IS - 1
ER -