The quality of in-house medical school examinations.

Purpose: Most medical schools test their students throughout the curriculum using in-house examinations written by the faculty who teach the courses. The authors assessed the quality of in-house examinations used in three U.S. medical schools.

Method: In 1998, nine basic science examinations from the three schools were gathered, and each question was subjected to quality assessment by three expert biomedical test developers, each of whom had extensive experience in reviewing and evaluating questions for the United States Medical Licensing Examination (USMLE) Steps 1 and 2. Questions were rated on a five-point scale, from 1 (tested recall only and was technically flawed) to 5 (used a clinical or laboratory vignette, required reasoning to answer, and was free of technical flaws). Each rater made independent assessments, and the mean score for each question was calculated. Mean quality scores for questions written by National Board of Medical Examiners (NBME)-trained question writers were compared with mean scores for questions written by faculty without NBME training. The raters' quality assessments were made without knowledge of the test writers' training background or the study's hypothesis.
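
To make the scoring procedure concrete, here is a minimal Python sketch of the aggregation described above: each question's score is the mean of the three raters' independent ratings, and those per-question means are then averaged by writer group. The question labels, ratings, and group assignments below are invented for illustration; only the aggregation steps follow the abstract.

import statistics

# Hypothetical ratings: question -> (writer was NBME-trained?, scores from the three raters)
ratings = {
    "Q1": (True,  [5, 4, 5]),
    "Q2": (False, [2, 1, 2]),
    "Q3": (False, [3, 2, 2]),
}

# Mean of the three independent ratings for each question
per_question = {q: statistics.mean(scores) for q, (_, scores) in ratings.items()}

# Group means by writer training, as compared in the study
for trained in (True, False):
    group = [per_question[q] for q, (is_trained, _) in ratings.items() if is_trained == trained]
    label = "NBME-trained" if trained else "not NBME-trained"
    print(f"{label}: mean quality = {statistics.mean(group):.2f}")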

Results: A total of 555 questions were analyzed. The mean score for all questions was 2.39 ± 1.21. The 92 questions written by NBME-trained question writers had a mean score of 4.24 ± 0.85, and the 463 questions written by faculty without formal NBME training had a mean score of 2.03 ± 0.90 (p < .01).
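
As a quick plausibility check, the summary statistics above (group means, standard deviations, and group sizes) are enough to compute a two-sample t-test. The abstract does not state which statistical test the authors used, so the Welch test in this Python sketch is an assumption made purely for illustration.

from scipy.stats import ttest_ind_from_stats

# Summary statistics as reported in the Results paragraph
t, p = ttest_ind_from_stats(
    mean1=4.24, std1=0.85, nobs1=92,    # questions by NBME-trained writers
    mean2=2.03, std2=0.90, nobs2=463,   # questions by writers without NBME training
    equal_var=False,                    # Welch's t-test (assumption; the test is not named in the abstract)
)
print(f"t = {t:.1f}, p = {p:.2g}")      # consistent with the reported p < .01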

Conclusions: The in-house examinations were of relatively low quality. The quality of examination questions can be significantly improved by providing question writers with formal training.

Source: http://dx.doi.org/10.1097/00001888-200202000-00016

Publication Analysis

Top Keywords (term, frequency):
question writers, 12
quality in-house, 8
in-house examinations, 8
written faculty, 8
examinations three, 8
nbme training, 8
questions written, 8
questions, 6
quality, 5
medical, 5
