
Assessment methods and resource requirements for milestone reporting by an emergency medicine clinical competency committee.

Background: The Accreditation Council for Graduate Medical Education (ACGME) introduced milestones for Emergency Medicine (EM) in 2012. Clinical Competency Committees (CCC) are tasked with assessing residents on milestones and reporting them to the ACGME. Appropriate workflows for CCCs are not well defined.

Objective: Our objective was to compare different approaches to milestone assessment by a CCC, to quantify the resource requirements of each, and to identify the most efficient workflow.

Design: Three distinct processes for rendering milestone assessments were compared: full milestone assessments (FMA), which utilized all available resident assessment data; ad-hoc milestone assessments (AMA), created by multiple expert educators using their personal assessment of resident performance; and self-assessments (SMA), completed by residents. FMA were selected as the theoretical gold standard. Intraclass correlation coefficients were used to analyze agreement between the different assessment methods. Kendall's coefficient was used to assess inter-rater agreement for the AMA.
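The two agreement statistics named above can be sketched in a few lines. The abstract does not specify which ICC form or which Kendall coefficient was used, so this minimal sketch assumes ICC(2,1) (two-way random effects, single measure) for FMA-vs-AMA agreement and Kendall's coefficient of concordance W (without tie correction) for inter-rater agreement; all scores below are invented placeholders, not study data.

```python
import numpy as np
from scipy.stats import rankdata

def icc2_1(Y):
    """ICC(2,1) for an (n subjects x k raters) score matrix, via two-way ANOVA."""
    n, k = Y.shape
    grand = Y.mean()
    SSR = k * ((Y.mean(axis=1) - grand) ** 2).sum()   # between-subject sum of squares
    SSC = n * ((Y.mean(axis=0) - grand) ** 2).sum()   # between-rater sum of squares
    SST = ((Y - grand) ** 2).sum()
    SSE = SST - SSR - SSC                             # residual sum of squares
    MSR = SSR / (n - 1)
    MSC = SSC / (k - 1)
    MSE = SSE / ((n - 1) * (k - 1))
    return (MSR - MSE) / (MSR + (k - 1) * MSE + k * (MSC - MSE) / n)

def kendalls_w(ratings):
    """Kendall's W for an (m raters x n subjects) score matrix (no tie correction)."""
    m, n = ratings.shape
    ranks = np.apply_along_axis(rankdata, 1, ratings)  # rank subjects within each rater
    R = ranks.sum(axis=0)                              # rank sum per subject
    S = ((R - R.mean()) ** 2).sum()                    # spread of the rank sums
    return 12 * S / (m ** 2 * (n ** 3 - n))

# Invented milestone scores for 5 residents, scored two ways.
fma = np.array([3.0, 4.5, 2.0, 6.0, 5.0])
ama = np.array([3.5, 4.0, 2.5, 5.5, 5.0])
print(icc2_1(np.column_stack([fma, ama])))   # FMA vs. AMA agreement
# Agreement across two hypothetical AMA raters:
print(kendalls_w(np.vstack([ama, [3.0, 4.5, 2.0, 6.0, 4.5]])))
```

Both statistics approach 1.0 under perfect agreement and fall toward 0 as raters diverge, which is the sense in which the study compares AMA and SMA against the FMA gold standard.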

Results: All 13 second-year residents and 7 educational faculty of an urban EM Residency Program participated in the study in 2013. Substantial or better agreement between FMA and AMA was seen for 8 of the 23 total subcompetencies (PC4, PC8, PC9, PC11, MK, PROF2, ICS2, SBP2), and for 1 subcompetency (SBP1) between FMA and SMA. Multiple AMA for individual residents demonstrated substantial or better interobserver agreement in 3 subcompetencies (PC1, PC2, and PROF2). FMA took longer to complete compared to AMA (80.9 vs. 5.3 min, p < 0.001).

Conclusions: Using AMA to evaluate residents on the milestones takes significantly less time than FMA. However, AMA and SMA agree with FMA on only 8 and 1 subcompetencies, respectively. An estimated 23.5 h of faculty time are required each month to fulfill the requirement for semiannual reporting for a residency with 42 trainees.

Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6211216
DOI: http://dx.doi.org/10.1080/10872981.2018.1538925

