Background: Commercial-off-the-shelf learning platforms developed for medical education (herein referred to as MedED-COTS) have emerged as a resource used by a majority of medical students to prepare for licensing examinations. As MedED-COTS proliferate and include more functions and features, there is a need for an up-to-date review to inform medical educators on (a) students' use of MedED-COTS outside the formal medical school curriculum, (b) the integration of MedED-COTS into the formal curriculum, and (c) the potential effects of MedED-COTS usage on students' national licensing exam scores in the USA.
Methods: Due to the limited number of studies published on either the use or integration of MedED-COTS, a focused review of the literature was conducted to guide future research and practice. Data extraction and quality appraisal were conducted independently by three reviewers, with disagreements resolved by a fourth reviewer. A narrative synthesis was completed to answer the research questions, contextualize results, and identify trends and issues in the findings reported by the studies included in the review.
Results: Results revealed consistent positive correlations between students' use of question banks and their licensing exam performance. The limited number of integration studies, combined with a number of methodological issues, makes it impossible to isolate specific effects or associations of integrated commercial resources on standardized test or course outcomes. However, consistent positive correlations, along with students' pervasive use and strong theoretical foundations explaining the results, provide evidence for integrating MedED-COTS into medical school curricula and highlight the need for further research.
Conclusions: Based on these findings, we conclude that students use exam preparation materials widely and that these materials have a positive impact on exam results; the literature on the integration of MedED-COTS into the formal curriculum, and on students' use of resources beyond exam preparation, remains scant.
DOI: http://dx.doi.org/10.1080/0142159X.2022.2039380
Fam Med
December 2024
Department of Academic Affairs, Brody School of Medicine, East Carolina University, Greenville, NC.
Background And Objectives: Medical trainees express difficulty with interpreting statistics in clinical literature. To elucidate educational gaps, we compared statistical methodologies in biomedical literature with biostatistical content in licensing exam study materials.
Methods: In this bibliographic content analysis, we compiled a stratified random sample of articles involving original data analysis published during 2023 in 72 issues of three major medical journals.
Comput Inform Nurs
January 2025
Author Affiliations: Wolters Kluwer (Drs Moran and Terry, Ms Chery, Mrs Madden, and Rightler); and Independent Psychometric Consultant (Dr Viger).
End-of-program predictive examinations have existed in nursing education for over 10 years. Nursing schools have used these examinations to prepare students for the testable content from the National Council of State Boards of Nursing (NCSBN), which has been delivering the NCLEX-RN since 1994. Nursing students in the final semester of their nursing program took the Predictable Ability Measurement Readiness (PAMR) 1 and/or 2.
Curr Pharm Teach Learn
December 2024
University of Colorado Skaggs School of Pharmacy and Pharmaceutical Sciences, United States of America. Electronic address:
Background: Most states require pharmacists to successfully pass the Multistate Pharmacy Jurisprudence Examination (MPJE), administered by the National Association of Boards of Pharmacy (NABP), to obtain licensure as a pharmacist, though pass rates for the MPJE have declined in recent years. Meanwhile, NABP is pursuing efforts to standardize the exam with the emergent Uniform Pharmacy Jurisprudence Examination (UPJE).
Objective: This study aimed to describe the current thinking of pharmacy law educators across the US on the UPJE.
Cureus
November 2024
Department of Medical Education, Nova Southeastern University Dr. Kiran C. Patel College of Allopathic Medicine, Fort Lauderdale, USA.
Medical school exams, such as those by the National Board of Medical Examiners (NBME) and the United States Medical Licensing Examination (USMLE), assess essential knowledge and skills for safe patient care and are critical for student advancement and securing competitive residencies. Understanding the correlation between exam scores and medical school performance, as well as identifying trends among high scorers, provides valuable insights for both medical students and educators. This review examines the link between study resources and NBME exam scores, as well as psychological factors influencing these outcomes.
Adv Med Educ Pract
December 2024
Department of Medical Education, Kirk Kerkorian School of Medicine at UNLV, Las Vegas, NV, USA.