
Performance results for a workstation-integrated radiology peer review quality assurance program.

Int J Qual Health Care

Department of Radiology and Diagnostic Imaging, University of Alberta, Edmonton, Alberta, Canada; Medical Imaging Consultants, 11010-101 Street, Edmonton, Alberta, Canada T5H 4B9.

Published: June 2016

Objective: To assess review completion rates, RADPEER score distribution, and sources of disagreement when using a workstation-integrated radiology peer review program, and to evaluate radiologist perceptions of the program.

Design: Retrospective review of prospectively collected data.

Setting: Large private outpatient radiology practice.

Participants: Radiologists (n = 66) with a mean of 16.0 (standard deviation, 9.2) years of experience.

Interventions: Prior studies and reports of cases being actively reported were randomly selected for peer review using the RADPEER scoring system (a 4-point scale, with a score of 1 indicating agreement and scores of 2-4 indicating increasing levels of disagreement).
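To make the score groupings used later in the Results concrete, here is a minimal illustrative sketch in Python; the helper names are mine, not from the study or the RADPEER program, and the logic encodes only what this abstract states about the scale:

    # Illustrative only; helper names are hypothetical, not from the study.
    # RADPEER 4-point scale as described in this abstract:
    #   1 = agreement; 2-4 = increasing levels of disagreement.

    def is_discrepant(score: int) -> bool:
        # Scores of 2, 3 or 4 were counted as discrepant (see Results).
        return score in (2, 3, 4)

    def is_clinically_significant(score: int) -> bool:
        # Scores of 3 or 4 were counted as clinically significant.
        return score in (3, 4)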

Main Outcome Measures: Assigned peer review completion rates, review scores, sources of disagreement and radiologist survey responses.

Results: Of 31 293 assigned cases, 29 044 (92.8%; 95% CI 92.5-93.1%) were reviewed. Discrepant scores (score = 2, 3 or 4) were given in 0.69% (95% CI 0.60-0.79%) of cases and clinically significant discrepancy (score = 3 or 4) was assigned in 0.42% (95% CI 0.35-0.50%). The most common cause of disagreement was missed diagnosis (75.2%; 95% CI 66.8-82.1%). By anonymous survey, 94% of radiologists felt that peer review was worthwhile, 90% reported that the scores they received were appropriate and 78% felt that the received feedback was valuable.
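The abstract does not state how these confidence intervals were computed. As a point of reference, a Wilson score interval applied to the one proportion whose raw counts are given (29 044 of 31 293) reproduces the reported 92.8% (95% CI 92.5-93.1%). A minimal sketch under that assumption:

    from math import sqrt

    def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
        # 95% Wilson score interval for a binomial proportion.
        p = successes / n
        center = p + z * z / (2 * n)
        margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        denom = 1 + z * z / n
        return (center - margin) / denom, (center + margin) / denom

    # Reported completion: 29 044 of 31 293 assigned cases were reviewed.
    low, high = wilson_ci(29044, 31293)
    print(f"{29044 / 31293:.1%} (95% CI {low:.1%}-{high:.1%})")
    # -> 92.8% (95% CI 92.5%-93.1%)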

Conclusion: Workstation-based peer review can increase completion rates and levels of radiologist acceptance while producing RADPEER scores similar to those previously reported. This approach may be one way to increase radiologist engagement in peer review quality assurance.

Source
http://dx.doi.org/10.1093/intqhc/mzw017

Publication Analysis

Top Keywords

peer review (28)
completion rates (12)
review (10)
workstation-integrated radiology (8)
radiology peer (8)
review quality (8)
quality assurance (8)
review completion (8)
sources disagreement (8)
peer (7)
