Creation and implementation of a national emergency medicine fourth-year student examination.

J Emerg Med

Department of Emergency Medicine, Harvard Medical School, Massachusetts General Hospital, Boston, Massachusetts.

Published: December 2013

Background: A National Board of Medical Examiners examination does not exist for Emergency Medicine (EM) students. To fill this void, the Clerkship Directors in Emergency Medicine tasked a committee with development of an examination for 4th-year (M4) EM students, based on a published syllabus, and consisting of questions written according to published question-writing guidelines.

Study Objectives: To describe examination development and statistical performance at 9 months.

Methods: The committee reviewed an existing EM student question database at www.saemtests.org for statistical performance, compliance with item-writing guidelines, and topic inclusion within the published EM M4 syllabus. For syllabus topics without existing questions, committee members wrote new items. LXR 6.0 software (Applied Measurement Professionals, Inc., Georgetown, SC) was used for examination administration. Data gathered included numbers of examinations completed, mean scores with SD, and point biserial correlation (rpb).
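The item statistics gathered here (difficulty and point biserial correlation, rpb) are standard classical-test-theory measures. As a minimal illustrative sketch (not the study's actual LXR 6.0 implementation; the function name and sample data are assumptions), they can be computed per question like this:

```python
import math

def item_stats(correct, total_scores):
    """Classical item statistics for one exam question.

    correct:      list of 0/1 flags, one per examinee, for this item
    total_scores: each examinee's total exam score
    Returns (difficulty p, point-biserial correlation r_pb).
    """
    n = len(correct)
    p = sum(correct) / n                       # item difficulty: proportion answering correctly
    mean_all = sum(total_scores) / n
    sd = math.sqrt(sum((x - mean_all) ** 2 for x in total_scores) / n)
    # Mean total score of examinees who got the item right vs. wrong
    mean_right = sum(s for c, s in zip(correct, total_scores) if c) / sum(correct)
    mean_wrong = sum(s for c, s in zip(correct, total_scores) if not c) / (n - sum(correct))
    r_pb = (mean_right - mean_wrong) / sd * math.sqrt(p * (1 - p))
    return p, r_pb

# Illustrative data: 4 examinees' item responses and total scores
p, r_pb = item_stats([1, 1, 0, 0], [10, 8, 6, 4])
print(p, round(r_pb, 3))
```

A higher rpb means examinees who answer the item correctly also tend to score higher overall, i.e. the item discriminates well; values near zero (like the study's low end of 0.067) indicate weak discrimination.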

Results: Of the 553 questions assessed, 157 questions met the stated criteria, and 37 were included in the examination. Thirteen new questions were written by committee members to cover all curriculum topics. The National EM M4 Examination was released online August 1, 2011. Nine months later, the examination had been completed 1642 times by students from 27 clerkships. Mean score was 79.69% (SD 3.89). Individual question difficulties ranged from 26% to 99%. Question rpbs ranged from 0.067 to 0.353, mean 0.213 (SD 0.066).

Conclusions: A national group of EM educators developed an examination to assess a published clerkship syllabus. The examination contains questions written according to published item-writing guidelines, and exhibits content validity, appropriate difficulty levels, and adequate question discriminatory ability.

Source: http://dx.doi.org/10.1016/j.jemermed.2013.05.051

Publication Analysis

Top Keywords

emergency medicine (12), questions written (12), examination (10), published syllabus (8), written published (8), item-writing guidelines (8), committee members (8), questions (6), published (5), creation implementation (4)
