Peer review comments augment diagnostic error characterization and departmental quality assurance: 1-year experience from a children's hospital.

Objective: The objective of our study was to categorize radiologist peer review comments and evaluate their functions within the context of a comprehensive quality assurance (QA) program.

Materials And Methods: All randomly entered radiology peer review comments at our institution were compiled over a 1-year period (January 1, 2011, through December 31, 2011). A Web-based commercially available software package was used to query the comments, which were then exported into a spreadsheet. Each comment was then placed into a single most appropriate category based on consensus decision of two board-certified pediatric radiologists. QA scores associated with each comment were recorded.
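To make the export-and-tally step concrete, here is a minimal illustrative sketch of how comments exported to a spreadsheet could be counted by consensus category and QA score. The file name and the column headers ("category", "qa_score") are assumptions for illustration only; the abstract does not describe the commercial QA software's export format.

```python
# Illustrative sketch only: tally exported peer review comments by category.
# The CSV file name and the "category"/"qa_score" column headers are assumed;
# the study's commercial QA package and its export format are not specified.
import csv
from collections import Counter

def tally_comments(path="peer_review_comments_2011.csv"):
    categories = Counter()
    qa_score_1 = 0
    total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            categories[row["category"]] += 1     # single consensus category per comment
            if row["qa_score"].strip() == "1":   # comments entered with a QA score of 1
                qa_score_1 += 1
    return total, categories, qa_score_1

if __name__ == "__main__":
    total, categories, qa_score_1 = tally_comments()
    print(f"{total} comments; {qa_score_1 / total:.1%} entered with QA score 1")
    for name, n in categories.most_common():
        print(f"{name}: {n} ({n / total:.1%})")
```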

Results: A total of 427 peer review comments were evaluated. The majority of comments (85.9%) were entered voluntarily with QA scores of 1. A classification system was devised that augments traditional error classification. Seven broad comment categories were identified: errors of observation (25.5%), errors of interpretation (5.6%), inadequate patient data gathering (3.7%), errors of communication (9.6%), interobserver variability (21.3%), informational and educational feedback (23.0%), and complimentary (11.2%).
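For readers who want approximate raw counts behind the reported percentages, the sketch below back-calculates them from the 427-comment total. These counts are rounded reconstructions from the abstract, not figures taken from the paper.

```python
# Back-calculate approximate comment counts from the reported percentages.
# Rounded reconstructions only; the published per-category counts may differ.
TOTAL = 427
percentages = {
    "errors of observation": 25.5,
    "errors of interpretation": 5.6,
    "inadequate patient data gathering": 3.7,
    "errors of communication": 9.6,
    "interobserver variability": 21.3,
    "informational and educational feedback": 23.0,
    "complimentary": 11.2,
}
for category, pct in percentages.items():
    print(f"{category}: ~{round(TOTAL * pct / 100)} comments ({pct}%)")
print(f"percentages sum to {sum(percentages.values()):.1f}%")  # 99.9% due to rounding
```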

Conclusion: Comment-enhanced peer review expands traditional diagnostic error classification, may identify errors that were under-scored, provides continuous educational feedback for participants, and promotes a collegial environment.

Source: http://dx.doi.org/10.2214/AJR.12.9580

Publication Analysis

Top Keywords (frequency): peer review (20); review comments (16); diagnostic error (8); quality assurance (8); error classification (8); educational feedback (8); comments (6); peer (5); comments augment (4); augment diagnostic (4)
