The rapid evolution of artificial intelligence (AI), particularly in deep learning, has significantly impacted radiology, introducing an array of AI solutions for interpretative tasks. This paper provides radiology departments with a practical guide for selecting and integrating AI solutions, focusing on interpretative tasks that require the active involvement of radiologists. Our approach is not to list available applications or review scientific evidence, as this information is readily available in previous studies; instead, we concentrate on the essential factors radiology departments must consider when choosing AI solutions.
Background: Explainable Artificial Intelligence (XAI) is prominent in the diagnostics of opaque deep learning (DL) models, especially in medical imaging. Saliency methods are commonly used, yet quantitative evidence on their performance is lacking.
Objectives: To quantitatively evaluate the performance of widely utilized saliency XAI methods in the task of breast cancer detection on mammograms.
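The saliency methods evaluated in work like this typically attribute a model's prediction to input pixels via gradients. As a hedged illustration (not the study's actual pipeline, which uses deep mammography models), the sketch below computes vanilla gradient saliency for a toy logistic model, where each feature's attribution is the magnitude of the output's derivative with respect to that feature:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_saliency(w, b, x):
    """Vanilla gradient saliency for a logistic model p = sigmoid(w.x + b).

    Attribution for feature i is |dp/dx_i| = sigmoid'(z) * |w_i|,
    i.e. how strongly a small change in that feature moves the prediction.
    """
    z = w @ x + b
    p = sigmoid(z)
    return p * (1.0 - p) * np.abs(w)  # sigmoid'(z) = p(1-p)

# Toy "image" of four pixels; the weights make pixel 0 most influential.
w = np.array([2.0, 0.1, -1.5, 0.0])
x = np.array([0.5, 0.5, 0.5, 0.5])
sal = gradient_saliency(w, b=0.0, x=x)
print(sal.argmax())  # index of the most salient pixel
```

In deep models the same idea applies, with the gradient taken through the whole network (as in methods such as Grad-CAM or Integrated Gradients); quantitative evaluation then asks whether the highlighted regions coincide with the lesion.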
Machine learning (ML) models have become capable of making critical decisions on our behalf. Nevertheless, due to the complexity of these models, interpreting their decisions can be challenging, and humans cannot always control them. This paper provides explanations of decisions made by ML models in diagnosing four types of posterior fossa tumors: medulloblastoma, ependymoma, pilocytic astrocytoma, and brainstem glioma.
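One model-agnostic way to explain such diagnostic decisions is permutation feature importance: shuffle one input feature at a time and measure how much the model's accuracy drops. The sketch below is a minimal, hypothetical illustration on synthetic data, not the method or features used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic dataset: feature 0 determines the label, feature 1 is noise.
X = rng.normal(size=(200, 2))
y = (X[:, 0] > 0).astype(int)

def predict(X):
    # Stand-in for a trained classifier: thresholds feature 0.
    return (X[:, 0] > 0).astype(int)

def permutation_importance(predict, X, y, n_repeats=10):
    """Importance of feature j = mean drop in accuracy when column j is shuffled."""
    baseline = (predict(X) == y).mean()
    drops = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            Xp[:, j] = rng.permutation(Xp[:, j])
            drops[j] += baseline - (predict(Xp) == y).mean()
    return drops / n_repeats

imp = permutation_importance(predict, X, y)
print(imp)  # feature 0 should dominate; feature 1 should be near zero
```

Shuffling the decisive feature destroys the prediction (accuracy falls toward chance), while shuffling the noise feature changes nothing, so the importance scores directly reflect which inputs the model relies on.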