The ability to detect anomalies, i.e., anything not seen during training or out-of-distribution (OOD), in medical imaging applications is essential for successfully deploying machine learning systems. Filtering out OOD data using unsupervised learning is especially promising because it does not require costly annotations. A new class of models called AnoDDPMs, based on denoising diffusion probabilistic models (DDPMs), has recently achieved significant progress in unsupervised OOD detection. This work provides a benchmark for unsupervised OOD detection methods in digital pathology. By leveraging fast sampling techniques, we apply AnoDDPM at a scale large enough for whole-slide image analysis on the complete test set of the Camelyon16 challenge. Based on ROC analysis, we show that AnoDDPMs can detect OOD data with an AUC of up to 94.13 and 86.93 on two patch-level OOD detection tasks, outperforming the other unsupervised methods. We observe that AnoDDPMs alter the semantic properties of inputs, replacing anomalous data with more benign-looking tissue. Furthermore, we highlight the flexibility of AnoDDPM towards different information bottlenecks by evaluating reconstruction errors for inputs with different signal-to-noise ratios. While there is still a significant performance gap with fully supervised learning, AnoDDPMs show considerable promise in the field of OOD detection in digital pathology.
DOI: http://dx.doi.org/10.1016/j.media.2024.103088
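To make the reconstruction-error idea in the AnoDDPM abstract above concrete, below is a minimal sketch of diffusion-style anomaly scoring: partially noise a patch, let a denoiser reconstruct it, and score the pixel-wise error. The TinyDenoiser module, the simplified noising step, and the noise_level parameter are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of reconstruction-error anomaly scoring in the AnoDDPM style.
# The denoiser below is a hypothetical placeholder; a real setup would use a
# trained DDPM with its full (or fast) reverse sampling chain.
import torch
import torch.nn as nn

class TinyDenoiser(nn.Module):
    """Placeholder for a trained diffusion denoiser (predicts the clean image)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1))

    def forward(self, x_noisy, t):
        return self.net(x_noisy)

def anomaly_score(x, denoiser, noise_level=0.3):
    """Partially noise the input, reconstruct it, and score by pixel-wise error."""
    noise = torch.randn_like(x)
    x_noisy = (1 - noise_level) * x + noise_level * noise  # simplified noising
    with torch.no_grad():
        x_rec = denoiser(x_noisy, noise_level)
    return ((x - x_rec) ** 2).mean(dim=(1, 2, 3))          # per-image score

scores = anomaly_score(torch.rand(8, 3, 64, 64), TinyDenoiser())
```

Varying noise_level changes how much of the input survives the noising step, which loosely mirrors the abstract's point about different signal-to-noise ratios acting as different information bottlenecks.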
Biol Methods Protoc
January 2025
Department of Physics, George Washington University, Washington, DC 20052, United States.
A mixture-of-experts (MoE) approach has been developed to mitigate the poor out-of-distribution (OOD) generalization of deep learning (DL) models for single-sequence-based prediction of RNA secondary structure. The main idea behind this approach is to use DL models for in-distribution (ID) test sequences to leverage their superior ID performance, while relying on physics-based models for OOD sequences to ensure robust predictions. One key ingredient of the pipeline, named MoEFold2D, is automated ID/OOD detection via consensus analysis of an ensemble of DL model predictions, without requiring access to training data during inference.
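As an illustration of the consensus-style ID/OOD routing that MoEFold2D describes, here is a hedged sketch; the pairwise-identity agreement metric, the 0.8 threshold, and the model call signatures are assumptions for illustration, not the published pipeline.

```python
# Hedged sketch: route a sequence to DL or physics-based prediction depending on
# how well an ensemble of DL models agrees with itself. The agreement metric and
# the 0.8 threshold are illustrative assumptions.
from itertools import combinations

def pairwise_agreement(predictions):
    """Mean fraction of positions on which pairs of ensemble predictions agree."""
    pairs = list(combinations(predictions, 2))
    if not pairs:
        return 1.0
    scores = [sum(a == b for a, b in zip(p, q)) / len(p) for p, q in pairs]
    return sum(scores) / len(scores)

def predict_structure(seq, dl_models, physics_model, threshold=0.8):
    """Use a DL prediction when the ensemble agrees (ID), else fall back to physics."""
    ensemble_preds = [model(seq) for model in dl_models]
    if pairwise_agreement(ensemble_preds) >= threshold:
        return ensemble_preds[0]   # treated as in-distribution
    return physics_model(seq)      # treated as out-of-distribution
```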
Nat Commun
January 2025
Department of Chemistry, Theoretical Chemistry Institute, University of Wisconsin-Madison, Madison, WI, 53706, USA.
Identifying transitional states is crucial for understanding the protein conformational changes that underlie numerous biological processes. Markov state models (MSMs), built from Molecular Dynamics (MD) simulations, capture these dynamics through transitions among metastable conformational states and have demonstrated success in studying protein conformational changes. However, MSMs face challenges in identifying transition states: because they partition MD conformations into discrete metastable states (or free energy minima), they lack a description of the transition states located at the free energy barriers between them.
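As background on how an MSM captures dynamics through transitions among discrete metastable states, a minimal sketch of estimating a row-stochastic transition matrix from a state-assigned MD trajectory follows; the lag time and toy trajectory are illustrative assumptions.

```python
# Minimal sketch: estimate an MSM transition matrix from a discretized MD
# trajectory (each frame already assigned to a metastable state index).
import numpy as np

def transition_matrix(state_traj, n_states, lag=1):
    """Count transitions at the given lag time and row-normalize the counts."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(state_traj[:-lag], state_traj[lag:]):
        counts[i, j] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# Toy trajectory with two metastable states and rare transitions between them.
traj = [0, 0, 0, 1, 1, 1, 1, 0, 0, 1, 1, 0]
print(transition_matrix(traj, n_states=2))
```

Because frames near a barrier are rare and get assigned to the nearest minimum, such a count-based model says little about the transition states themselves, which is the gap the abstract highlights.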
Comput Biol Med
December 2024
Diagnostic Imaging Analysis Group, Medical Imaging Department, Radboud University Medical Center, Geert Grooteplein Zuid 10, 6525 GA, Nijmegen, the Netherlands.
Artificial Intelligence (AI) models may fail or suffer from reduced performance when applied to unseen data that differs from the training data distribution, a problem referred to as dataset shift. Automatic detection of out-of-distribution (OOD) data contributes to safe and reliable clinical implementation of AI models. In this study, we propose using a recognized OOD detection method that utilizes the Mahalanobis distance (MD) and compare its performance to widely known classical methods.
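To illustrate the Mahalanobis-distance approach mentioned above, here is a hedged sketch that fits ID feature statistics and scores new feature vectors by their distance; the single-Gaussian fit, the covariance regularization, and the random stand-in features are assumptions for illustration rather than the study's exact pipeline.

```python
# Hedged sketch: Mahalanobis-distance OOD scoring on feature embeddings.
# ID features (e.g., penultimate-layer activations) are summarized by a mean and
# covariance; a larger distance suggests the input is out-of-distribution.
import numpy as np

def fit_id_statistics(id_features):
    """Estimate mean and (regularized) inverse covariance of ID features."""
    mean = id_features.mean(axis=0)
    cov = np.cov(id_features, rowvar=False) + 1e-6 * np.eye(id_features.shape[1])
    return mean, np.linalg.inv(cov)

def mahalanobis_score(x, mean, cov_inv):
    """Mahalanobis distance of one feature vector to the ID distribution."""
    diff = x - mean
    return float(np.sqrt(diff @ cov_inv @ diff))

rng = np.random.default_rng(0)
id_feats = rng.normal(size=(500, 16))                  # stand-in for ID embeddings
mean, cov_inv = fit_id_statistics(id_feats)
print(mahalanobis_score(rng.normal(size=16), mean, cov_inv))            # ID-like
print(mahalanobis_score(rng.normal(loc=5.0, size=16), mean, cov_inv))   # OOD-like
```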
Int J Med Inform
December 2024
Department of Medical Informatics, Amsterdam Public Health Research Institute, Amsterdam UMC, University of Amsterdam, the Netherlands; Institute of Logic, Language and Computation, University of Amsterdam, the Netherlands; Pacmed, Amsterdam, the Netherlands.
Background: Machine Learning (ML) models often struggle to generalize effectively to data that deviates from the training distribution. This raises significant concerns about the reliability of real-world healthcare systems when they encounter such inputs, known as out-of-distribution (OOD) data. These concerns can be addressed by real-time detection of OOD inputs.
Comput Med Imaging Graph
January 2025
ICMUB, Université de Bourgogne, Dijon, France.
In real-world scenarios, medical image segmentation models encounter input images that may deviate from the training images in various ways. These differences can arise from changes in image scanners and acquisition protocols, or the images may even come from a different modality or domain. When the model encounters these out-of-distribution (OOD) images, it can behave unpredictably.