Daily Arxiv

This page curates AI-related papers published worldwide.
All content is summarized by Google Gemini, and the page is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

A Comprehensive Framework for Uncertainty Quantification of Voxel-wise Supervised Models in IVIM MRI

Created by
  • Haebom

Author

Nicola Casali, Alessandro Brusaferri, Giuseppe Baselli, Stefano Fumagalli, Edoardo Micotti, Gianluigi Forloni, Riaz Hussein, Giovanna Rizzo, Alfonso Mastropietro

Outline

This paper addresses the difficulty of accurately estimating intravoxel incoherent motion (IVIM) parameters from diffusion-weighted MRI (DW-MRI), a task complicated by the ill-posed inverse problem and its high sensitivity to noise, especially in the perfusion compartment. To address this, the authors propose a probabilistic deep learning framework based on deep ensembles (DE) of Mixture Density Networks (MDNs). The framework estimates the total predictive uncertainty and decomposes it into aleatoric uncertainty (AU) and epistemic uncertainty (EU).

The proposed method is compared against non-probabilistic neural networks, a Bayesian fitting method, and probabilistic networks with a single-Gaussian parameterization. Supervised training is performed on synthetic data, and evaluation is carried out on both simulated and in vivo datasets. The reliability of the quantified uncertainty is assessed using calibration curves, the sharpness of the predictive distributions, and the continuous ranked probability score (CRPS).

MDNs produced better-calibrated and sharper predictive distributions for the diffusion coefficient D and the perfusion fraction f, although slight overconfidence was observed for the pseudo-diffusion coefficient D*. The robust coefficient of variation (RCV) indicated smoother in vivo estimates of D* with MDNs than with the single-Gaussian model. Despite training data covering the expected physiological range, the high in vivo EU suggests a mismatch with actual acquisition conditions and highlights the importance of the EU estimate enabled by DE. Overall, the paper presents a comprehensive framework for IVIM fitting with uncertainty quantification, allowing unreliable estimates to be identified and interpreted, and the approach can be extended to other physical model fitting problems with appropriate adjustments to the architecture and simulations.
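For orientation, below is a minimal sketch of the bi-exponential IVIM signal model and of how a synthetic supervised training set could be generated. The two-compartment form S(b) = S0·(f·exp(−b·D*) + (1−f)·exp(−b·D)) is the standard IVIM model, but the b-values, parameter ranges, SNR, and Rician noise model in this sketch are illustrative assumptions, not the paper's exact simulation settings.

```python
# Illustrative IVIM simulation for supervised training (assumed settings, not the paper's).
import numpy as np

def ivim_signal(b, D, f, D_star, S0=1.0):
    """Bi-exponential IVIM model: S(b) = S0 * (f*exp(-b*D*) + (1 - f)*exp(-b*D))."""
    return S0 * (f * np.exp(-b * D_star) + (1.0 - f) * np.exp(-b * D))

def simulate_training_set(n_samples, b_values, snr=30.0, seed=None):
    """Sample parameters from plausible physiological ranges and add Rician noise."""
    rng = np.random.default_rng(seed)
    D      = rng.uniform(0.0005, 0.003, n_samples)   # diffusion coefficient [mm^2/s]
    f      = rng.uniform(0.0, 0.4, n_samples)        # perfusion fraction
    D_star = rng.uniform(0.005, 0.1, n_samples)      # pseudo-diffusion coefficient [mm^2/s]
    clean  = ivim_signal(b_values[None, :], D[:, None], f[:, None], D_star[:, None])
    sigma  = 1.0 / snr                                # noise level relative to S0 = 1
    noisy  = np.sqrt((clean + sigma * rng.standard_normal(clean.shape)) ** 2
                     + (sigma * rng.standard_normal(clean.shape)) ** 2)  # Rician magnitude
    targets = np.stack([D, f, D_star], axis=1)        # supervised labels per signal
    return noisy, targets

# Illustrative acquisition: 8 b-values in s/mm^2.
b_values = np.array([0, 10, 20, 50, 100, 200, 400, 800], dtype=float)
signals, params = simulate_training_set(10_000, b_values, snr=30.0, seed=0)
```

Each noisy signal vector paired with its (D, f, D*) label would serve as one supervised training example for the networks compared in the paper.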

Takeaways, Limitations

Takeaways:
We present a probabilistic deep learning framework for quantifying uncertainty in IVIM parameter estimation.
Decomposing aleatoric and epistemic uncertainty makes unreliable estimates identifiable and interpretable (a minimal sketch of this split appears after the Limitations list below).
MDNs yield better-calibrated, sharper predictive distributions and smoother in vivo D* estimates than single-Gaussian networks.
Provides a general framework applicable to fitting a variety of physical models.
Limitations:
Slight overconfidence was observed for the pseudo-diffusion coefficient D*.
High epistemic uncertainty on in vivo data suggests a mismatch with actual acquisition conditions and points to the need for further improvement.
Since training uses only synthetic data, generalization to real data requires further validation.
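As referenced in the Takeaways above, below is a minimal sketch of how a deep ensemble of probabilistic networks separates aleatoric from epistemic uncertainty via the law of total variance. The reduction of each ensemble member to a single predictive mean and variance (rather than a full Gaussian mixture) and the function name are illustrative assumptions, not the paper's implementation.

```python
# Illustrative aleatoric/epistemic split for a deep ensemble (assumed simplification).
import numpy as np

def decompose_uncertainty(member_means, member_vars):
    """
    member_means, member_vars: (M, N) arrays with each of the M ensemble members'
    predictive mean and variance for N voxels (one IVIM parameter at a time).
    Returns (total, aleatoric, epistemic) variances per voxel.
    """
    aleatoric = member_vars.mean(axis=0)   # average within-member (data) variance
    epistemic = member_means.var(axis=0)   # spread of member means (model disagreement)
    return aleatoric + epistemic, aleatoric, epistemic

# Toy example: M = 5 ensemble members, N = 3 voxels, D-like values.
rng = np.random.default_rng(0)
mu  = rng.normal(1e-3, 1e-4, size=(5, 3))
var = np.full((5, 3), 1e-8)
total, au, eu = decompose_uncertainty(mu, var)
```

In this decomposition, the averaged within-member variance plays the role of AU (noise-driven spread), while the disagreement between member means plays the role of EU, which is the quantity that flags in vivo conditions not covered by the training simulations.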