Background: Commercial-off-the-shelf learning platforms developed for medical education (herein referred to as MedED-COTS) have emerged as a resource used by a majority of medical students to prepare for licensing examinations. As MedED-COTS proliferate and include more functions and features, there is a need for an up-to-date review to inform medical educators on (a) students' use of MedED-COTS outside the formal medical school curriculum, (b) the integration of MedED-COTS into the formal curriculum, and (c) the potential effects of MedED-COTS usage on students' national licensing exam scores in the USA.

Methods: Due to the limited number of studies published on either the use or integration of MedED-COTS, a focused review of the literature was conducted to guide future research and practice. Data extraction and quality appraisal were conducted independently by three reviewers, with disagreements resolved by a fourth reviewer. A narrative synthesis was completed to answer the research questions, contextualize results, and identify trends and issues in the findings reported by the studies included in the review.

Results: Results revealed consistent positive correlations between students' use of question banks and their licensing exam performance. The limited number of integration studies, combined with a number of methodological issues, makes it impossible to isolate specific effects or associations of integrated commercial resources on standardized test or course outcomes. However, consistent positive correlations, along with students' pervasive use and strong theoretical foundations explaining the results, provide evidence for integrating MedED-COTS into medical school curricula and highlight the need for further research.

Conclusions: Based on these findings, we conclude that students make broad use of commercial exam preparation materials and that such use has a positive impact on exam results; however, the literature on the integration of MedED-COTS into the formal curriculum, and on students' use of these resources beyond exam preparation, remains scant.


Source
http://dx.doi.org/10.1080/0142159X.2022.2039380


Similar Publications

Background And Objectives: Medical trainees express difficulty with interpreting statistics in clinical literature. To elucidate educational gaps, we compared statistical methodologies in biomedical literature with biostatistical content in licensing exam study materials.

Methods: In this bibliographic content analysis, we compiled a stratified random sample of articles involving original data analysis published during 2023 in 72 issues of three major medical journals.


Comparison of Predictable Ability Measure Using Examinations and Nursing Licensure Success.

Comput Inform Nurs

January 2025

Author Affiliations: Wolters Kluwer (Drs Moran and Terry, Ms Chery, Mrs Madden, and Rightler); and Independent Psychometric Consultant (Dr Viger).

End-of-program predictive examinations have existed in nursing education for over 10 years. Nursing schools have used these examinations to prepare students for the testable content from the National Council of State Boards of Nursing (NCSBN), which has been delivering the NCLEX-RN since 1994. Nursing students in the final semester of their program took the Predictable Ability Measurement Readiness (PAMR) 1 and/or 2.


Background: Most states require pharmacists to pass the Multistate Pharmacy Jurisprudence Examination (MPJE), administered by the National Association of Boards of Pharmacy (NABP), to obtain licensure as a pharmacist, though pass rates for the MPJE have declined in recent years. Meanwhile, NABP is pursuing efforts to standardize the exam with the emergent Uniform Pharmacy Jurisprudence Examination (UPJE).

Objective: This study aimed to describe the current thinking of pharmacy law educators across the US on the UPJE.


Medical school exams, such as those administered by the National Board of Medical Examiners (NBME) and the United States Medical Licensing Examination (USMLE), assess the knowledge and skills essential for safe patient care and are critical for student advancement and securing competitive residencies. Understanding the correlation between exam scores and medical school performance, as well as identifying trends among high scorers, provides valuable insights for both medical students and educators. This review examines the link between study resources and NBME exam scores, as well as psychological factors influencing these outcomes.

Article Synopsis
  • This study analyzes the effectiveness of multiple-choice questions (MCQs) in medical education, specifically their ability to assess factual versus conceptual knowledge.
  • Researchers tested a hypothesis that students would score higher on factual recall questions versus conceptual inference questions, and they tracked the retention of this knowledge over two years.
  • Results showed that while performance on both types of questions declined over time, the decline was more pronounced for factual recall questions, indicating that conceptual understanding may be retained better over the long term.
