
Checking the checklist: a content analysis of expert- and evidence-based case-specific checklist items.

Med Educ

Radboud University Nijmegen Medical Centre, Academic Educational Institute, Nijmegen, the Netherlands; Behavioural Science Institute, Radboud University Nijmegen, Nijmegen, the Netherlands; Department of General Internal Medicine and Academic Educational Institute, Radboud University Nijmegen Medical Centre, Nijmegen, the Netherlands.

Published: September 2010

Objectives: Research on objective structured clinical examinations (OSCEs) is extensive. However, relatively little has been written on the development of case-specific checklists on history taking and physical examination. Background information on the development of these checklists is a key element of the assessment of their content validity. Usually, expert panels are involved in the development of checklists. The objective of this study is to compare expert-based items on OSCE checklists with evidence-based items identified in the literature.

Methods: Evidence-based items covering both history taking and physical examination for specific clinical problems and diseases were identified in the literature. Items on nine expert-based checklists for OSCE examination stations were evaluated by comparing them with items identified in the literature. The data were grouped into three categories: (i) expert-based items; (ii) evidence-based items, and (iii) evidence-based items with a specific measure of their relevance.

Results: Out of 227 expert-based items, 58 (26%) were not found in the literature. Of 388 evidence-based items found in the literature, 219 (56%) were not included in the expert-based checklists. Of these 219 items, 82 (37%) had a specific measure of importance, such as an odds ratio for a diagnosis, making that diagnosis more or less probable.
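The "specific measure of importance" mentioned above refers to how a ratio attached to a finding shifts the probability of a diagnosis. As a minimal illustrative sketch (not taken from the study itself), the standard calculation treats the ratio as a likelihood ratio: convert the prior probability to odds, multiply by the ratio, and convert back.

```python
def update_probability(prior_prob, likelihood_ratio):
    """Update a diagnostic probability given a likelihood ratio.

    Converts the prior probability to odds, multiplies by the
    likelihood ratio, and converts the posterior odds back to a
    probability. Purely illustrative; values are hypothetical.
    """
    prior_odds = prior_prob / (1 - prior_prob)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)


# Hypothetical example: a 20% prior and a finding with a ratio of 4
# raises the probability of the diagnosis to 50%.
print(update_probability(0.2, 4.0))
```

A ratio above 1 makes the diagnosis more probable; a ratio below 1 makes it less probable, which is the sense in which such items carry a measurable weight that purely expert-based items lack.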

Conclusions: Expert-based, case-specific checklist items developed for OSCE stations do not coincide with evidence-based items identified in the literature. Further research is needed to ascertain what this inconsistency means for test validity.

Source
http://dx.doi.org/10.1111/j.1365-2923.2010.03721.x

Publication Analysis

Top Keywords

evidence-based items: 24
items: 14
expert-based items: 12
items identified: 12
identified literature: 12
case-specific checklist: 8
checklist items: 8
history physical: 8
physical examination: 8
development checklists: 8
