Interactive Explainable Deep Learning Model Informs Prostate Cancer Diagnosis at MRI.

Radiology

From the Department of Radiology, Charité-Universitätsmedizin Berlin, Corporate Member of Freie Universität Berlin, Humboldt-Universität zu Berlin, and Berlin Institute of Health, Campus Virchow Klinikum, Augustenburgerplatz 1, 13353 Berlin, Germany (C.A.H., G.L.B., N.L.B., A.H., L.J.S., K.F., F.D., M.R., A.D.J.B., B.H., M.H., S.H., T.P.); Berlin Institute of Health (BIH), Berlin, Germany (C.A.H., N.L.B., L.J.S., T.P.); Faculty VI-Informatics and Media, Berliner Hochschule für Technik (BHT), Einstein Center Digital Future, Berlin, Germany (G.L.B., F.B.); Bayer AG, Medical Affairs and Pharmacovigilance, Integrated Evidence Generation & Business Innovation, Berlin, Germany (A.H.); Institute of Pathology, Charité - Universitätsmedizin Berlin, corporate member of Freie Universität Berlin, Humboldt-Universität zu Berlin and Berlin Institute of Health, Berlin, Germany (S.S.); and Department of Urology, Otto-von-Guericke-University Magdeburg, Germany and PROURO, Berlin, Germany (H.C.).

Published: May 2023

Background: Clinically significant prostate cancer (PCa) diagnosis at MRI requires accurate and efficient radiologic interpretation. Although artificial intelligence may assist in this task, lack of transparency has limited clinical translation.

Purpose: To develop an explainable artificial intelligence (XAI) model for clinically significant PCa diagnosis at biparametric MRI, using Prostate Imaging Reporting and Data System (PI-RADS) features to justify its classifications.

Materials and Methods: This retrospective study included consecutive patients with histopathologic analysis-proven prostatic lesions who underwent biparametric MRI and biopsy between January 2012 and December 2017. After image annotation by two radiologists, a deep learning model was trained to detect the index lesion; classify PCa, clinically significant PCa (Gleason score ≥ 7), and benign lesions (eg, prostatitis); and justify its classifications using PI-RADS features. Lesion- and patient-based performance were assessed with fivefold cross validation and areas under the receiver operating characteristic curve (AUC). Clinical feasibility was tested in a multireader study and on the external PROSTATEx data set. Statistical evaluation of the multireader study used the Mann-Whitney and exact Fisher-Yates tests.

Results: Overall, 1224 men (median age, 67 years; IQR, 62-73 years) had 3260 prostatic lesions (372 lesions with a Gleason score of 6; 743 lesions with a Gleason score of ≥ 7; 2145 benign lesions). The XAI model reliably detected clinically significant PCa in the internal (AUC, 0.89) and external (AUC, 0.87) test sets, with a sensitivity of 93% (95% CI: 87, 98) and an average of one false-positive finding per patient. Accuracy of the visual and textual explanations of XAI classifications was 80% (1080 of 1352), as confirmed by experts. XAI-assisted readings improved the confidence of nonexperts in assessing PI-RADS 3 lesions (4.1 vs 3.4 on a five-point Likert scale; P = .007) and reduced reading time by 58 seconds (P = .009).

Conclusion: The explainable AI model reliably detected and classified clinically significant prostate cancer and improved the confidence and reading time of nonexperts while providing visual and textual explanations based on well-established imaging features.

© RSNA, 2023. See also the editorial by Chapiro in this issue.
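The performance figures above rest on fivefold cross validation and the area under the receiver operating characteristic curve (AUC). A useful way to read an AUC of 0.89 is via its equivalence to the normalized Mann-Whitney U statistic: the probability that a randomly chosen clinically significant lesion receives a higher model score than a randomly chosen benign one. A minimal, stdlib-only Python sketch of both ideas (the function names and toy scores are illustrative and not taken from the study):

```python
import random

def auc(labels, scores):
    """AUC as the normalized Mann-Whitney U statistic: the probability
    that a random positive outscores a random negative (ties count 0.5)."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def fivefold_indices(n, seed=0):
    """Shuffle sample indices and deal them into five disjoint folds;
    each fold serves once as the held-out test set."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::5] for i in range(5)]

# Perfectly separated scores give AUC 1.0; an uninformative model ~0.5.
print(auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # 1.0
```

In practice one would average the per-fold AUCs (or pool the held-out scores) across the five folds; this sketch only shows the two building blocks.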

Source
http://dx.doi.org/10.1148/radiol.222276

