Background: Clinically significant prostate cancer (PCa) diagnosis at MRI requires accurate and efficient radiologic interpretation. Although artificial intelligence may assist in this task, lack of transparency has limited clinical translation.

Purpose: To develop an explainable artificial intelligence (XAI) model for clinically significant PCa diagnosis at biparametric MRI, using Prostate Imaging Reporting and Data System (PI-RADS) features to justify its classifications.

Materials and Methods: This retrospective study included consecutive patients with histopathologic analysis-proven prostatic lesions who underwent biparametric MRI and biopsy between January 2012 and December 2017. After image annotation by two radiologists, a deep learning model was trained to detect the index lesion; classify lesions as PCa, clinically significant PCa (Gleason score ≥ 7), or benign (eg, prostatitis); and justify its classifications using PI-RADS features. Lesion- and patient-based performance was assessed with fivefold cross-validation and areas under the receiver operating characteristic curve (AUCs). Clinical feasibility was tested in a multireader study and on the external PROSTATEx data set. Statistical evaluation of the multireader study included the Mann-Whitney and exact Fisher-Yates tests.

Results: Overall, 1224 men (median age, 67 years; IQR, 62-73 years) had 3260 prostatic lesions (372 lesions with Gleason score of 6; 743 lesions with Gleason score ≥ 7; 2145 benign lesions). The XAI model reliably detected clinically significant PCa in the internal (AUC, 0.89) and external (AUC, 0.87) test sets, with a sensitivity of 93% (95% CI: 87, 98) and an average of one false-positive finding per patient. Experts confirmed the visual and textual explanations of the XAI classifications as accurate in 80% of cases (1080 of 1352). XAI-assisted reading improved the confidence of nonexperts in assessing PI-RADS 3 lesions (4.1 vs 3.4 on a five-point Likert scale; P = .007) and reduced reading time by 58 seconds (P = .009).

Conclusion: The explainable AI model reliably detected and classified clinically significant prostate cancer and improved the confidence and reading time of nonexperts while providing visual and textual explanations based on well-established imaging features.

© RSNA, 2023. See also the editorial by Chapiro in this issue.
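The lesion-level evaluation described above (fivefold cross-validation scored with the area under the ROC curve) can be illustrated with a short sketch. The snippet below is a minimal, hypothetical example rather than the authors' pipeline: it stands in a logistic-regression classifier and synthetic features for the deep learning model and MRI data, and the names (X, y, n_lesions) and class balance are invented for illustration only.

import numpy as np
from sklearn.model_selection import StratifiedKFold
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Stand-in data: one feature vector per lesion; label 1 = clinically
# significant PCa (Gleason score >= 7), label 0 = benign or Gleason 6.
n_lesions, n_features = 3260, 16
X = rng.normal(size=(n_lesions, n_features))
y = (rng.random(n_lesions) < 0.23).astype(int)  # roughly 743/3260 positives
X[y == 1] += 0.8                                # inject a separable signal

# Stratified fivefold cross-validation with ROC AUC per fold.
aucs = []
for train_idx, test_idx in StratifiedKFold(
        n_splits=5, shuffle=True, random_state=0).split(X, y):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores = clf.predict_proba(X[test_idx])[:, 1]
    aucs.append(roc_auc_score(y[test_idx], scores))

print("per-fold AUC:", np.round(aucs, 3))
print(f"mean AUC: {np.mean(aucs):.3f}")

Stratified folds keep the proportion of clinically significant lesions roughly constant across folds, which matters when, as here, positives are a minority of lesions.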
Full text: http://dx.doi.org/10.1148/radiol.222276 (DOI: 10.1148/radiol.222276)