Interobserver Reliability of the Coronary Artery Disease Reporting and Data System in Clinical Practice.

Purpose: This study aimed to evaluate interobserver reproducibility between cardiothoracic radiologists applying the Coronary Artery Disease Reporting and Data System (CAD-RADS) to describe atherosclerotic burden on coronary computed tomography angiography.

Methods: Forty clinical computed tomography angiography cases were retrospectively and independently evaluated by 3 attending and 2 fellowship-trained cardiothoracic radiologists using the CAD-RADS lexicon. Radiologists were blinded to patient history and underwent initial training using a practice set of 10 subjects. Interobserver reproducibility was assessed using an intraclass correlation (ICC) on the basis of single-observer scores, absolute agreement, and a 2-way random-effects model. Nondiagnostic studies were excluded. ICC was also performed for CAD-RADS scores grouped by management recommendations for absent (0), nonobstructive (1 to 2), and potentially obstructive (3 to 5) CAD.
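To make the reliability analysis concrete, the short Python sketch below shows how a single-rater, absolute-agreement, two-way random-effects ICC (often written ICC(2,1)) can be computed with the pingouin package, together with the grouping of CAD-RADS scores into management categories (0; 1 to 2; 3 to 5). This is an illustrative sketch, not the authors' code, and the case IDs, reader labels, and scores are hypothetical placeholders rather than the study's data.

import pandas as pd
import pingouin as pg

# Long-format ratings: one row per (case, reader) pair; CAD-RADS scores are made up.
ratings = pd.DataFrame({
    "case":    [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4, 5, 5, 5, 6, 6, 6],
    "reader":  ["A", "B", "C"] * 6,
    "cadrads": [3, 3, 4, 0, 0, 0, 2, 1, 2, 5, 4, 5, 1, 1, 2, 4, 4, 4],
})

# ICC on the raw CAD-RADS scores; the "ICC2" row is the single-rater,
# absolute-agreement, two-way random-effects estimate described in the Methods.
icc = pg.intraclass_corr(data=ratings, targets="case",
                         raters="reader", ratings="cadrads")
print(icc.set_index("Type").loc["ICC2", ["ICC", "CI95%", "pval"]])

# ICC repeated on scores grouped by management recommendation:
# 0 = absent, 1 = nonobstructive (CAD-RADS 1-2), 2 = potentially obstructive (3-5).
ratings["group"] = ratings["cadrads"].map(lambda s: 0 if s == 0 else (1 if s <= 2 else 2))
icc_group = pg.intraclass_corr(data=ratings, targets="case",
                               raters="reader", ratings="group")
print(icc_group.set_index("Type").loc["ICC2", ["ICC", "CI95%", "pval"]])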

Results: Interobserver reproducibility was moderate to good (ICC: 0.748, 95% confidence interval [CI]: 0.639-0.842, P<0.0001), with higher agreement among cardiothoracic radiology fellows (ICC: 0.853, 95% CI: 0.730-0.922, P<0.0001) than attending radiologists (ICC: 0.711, 95% CI: 0.568-0.824, P<0.0001). Interobserver reproducibility for clinical management categories was marginally decreased (ICC: 0.692, 95% CI: 0.570-0.802, P<0.0001). The average percent agreement between pairs of radiologists was 84.74%. Percent observer agreement was significantly reduced in the presence (M=62.22%, SD=15.17%) versus the absence (M=80.91%, SD=17.97%) of modifiers, t(37.95)=3.566, P=0.001.
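The fractional degrees of freedom in the modifier comparison (t(37.95)) suggest a Welch two-sample t test, which does not assume equal variances. A minimal SciPy sketch of that comparison, using hypothetical per-comparison percent-agreement values rather than the study's measurements, might look like this:

import numpy as np
from scipy import stats

# Hypothetical pairwise percent-agreement values, split by whether a
# CAD-RADS modifier was present; these numbers are illustrative only.
agreement_with_modifier = np.array([60.0, 50.0, 70.0, 80.0, 55.0, 65.0])
agreement_without_modifier = np.array([90.0, 100.0, 70.0, 80.0, 85.0, 95.0])

# Welch's t test (equal_var=False) is consistent with a fractional-df result.
t_stat, p_value = stats.ttest_ind(agreement_with_modifier,
                                  agreement_without_modifier,
                                  equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")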

Conclusions: Interobserver reliability and agreement with the CAD-RADS terminology are moderate to good in clinical practice. However, further investigations are needed to characterize the causes of interobserver disagreement that may lead to differences in management recommendations.


Source
http://dx.doi.org/10.1097/RTI.0000000000000503

Publication Analysis

Top Keywords

interobserver reproducibility: 12
coronary artery: 8
artery disease: 8
disease reporting: 8
reporting data: 8
data system: 8
cardiothoracic radiologists: 8
computed tomography: 8
interobserver: 4
interobserver reliability: 4
