J Grad Med Educ
December 2024
Thomas J. Nasca, MD, MACP, served as the President and Chief Executive Officer (CEO) of the Accreditation Council for Graduate Medical Education (ACGME) for 17 years, with his tenure ending December 2024. During this time he led and supported significant changes in accreditation and medical education.
Although Clinical Competency Committees (CCCs) were implemented to facilitate the goals of competency-based medical education, implementation has been variable, and it is not known whether or how these committees have affected programs and assessment in graduate medical education (GME). To explore the roles CCCs fulfill in GME and their effect on trainees, faculty, and programs. We conducted a narrative review of CCC primary research with the following inclusion criteria: all articles must be research in nature, focused on GME and specifically studying CCCs, and published in English-language journals from January 2013 to November 2022.
View Article and Find Full Text PDFPurpose: Accelerated 3-year programs (A3YPs) at medical schools were developed to address student debt and mitigate workforce shortage issues. This study investigated whether medical school length (3 vs 4 years) was associated with early residency performance. The primary research question was as follows: Are the Accreditation Council for Graduate Medical Education Milestones (MS) attained by A3YP graduates comparable to graduates of traditional 4-year programs (T4YPs) at 6 and 12 months into internship?
Method: The MS data from students entering U.
Objective: To establish whether Accreditation Council for Graduate Medical Education Milestones predict future performance of general surgery trainees.
Summary Background Data: Milestones provide bi-annual assessments of trainee progress across six competencies. It is unknown whether the Milestones predict surgeon performance after the transition to independent practice.
Importance: National data on the development of competence during training have been reported using the Accreditation Council for Graduate Medical Education (ACGME) Milestones system. It is now possible to consider longitudinal analyses that link Milestone ratings during training to patient outcomes data of recent graduates.
Objective: To evaluate the association of in-training ACGME Milestone ratings in a surgical specialty with subsequent complication rates following a commonly performed operation, endovascular aortic aneurysm repair (EVAR).
Changes in digital technology, increasing volume of data collection, and advances in methods have the potential to unleash the value of big data generated through the education of health professionals. Coupled with this potential are legitimate concerns about how data can be used or misused in ways that limit autonomy, equity, or harm stakeholders. This consensus statement is intended to address these issues by foregrounding the ethical imperatives for engaging with big data as well as the potential risks and challenges.
Although entrustment-supervision ratings are more intuitive than other rating scales, it is not known whether their use accurately assesses the appropriateness of care provided by a resident. To determine the frequency of incorrect entrustment ratings assigned by faculty, and whether the accuracy of an entrustment-supervision scale differed by resident performance when the scripted resident performance level was known. Faculty participants rated standardized residents in 10 videos using a 4-point entrustment-supervision scale.
Background: While some prior studies of work-based assessment (WBA) numeric ratings have not shown gender differences, they have been unable to account for the true performance of the resident or explore narrative differences by gender.
Objective: To explore gender differences in WBA ratings as well as narrative comments (when scripted performance was known).
Design: Secondary analysis of WBAs obtained from a randomized controlled trial of a longitudinal rater training intervention in 2018-2019.
Systems-based practice (SBP) was introduced as 1 of 6 core competencies in 1999 because of its recognized importance in the quality and safety of health care provided to patients. Nearly 25 years later, faculty and learners continue to struggle with understanding and implementing this essential competency, hindering the medical education community's ability to teach and learn it effectively. Milestones were first introduced in 2013 as one effort to support implementation of the general competencies.
The complexity of improving health in the United States and the rising call for outcomes-based physician training present unique challenges and opportunities for both graduate medical education (GME) and health systems. GME programs have been particularly challenged to implement systems-based practice (SBP) as a core physician competency and educational outcome. Disparate definitions and educational approaches to SBP, as well as limited understanding of the complex interactions between GME trainees, programs, and their health system settings, contribute to current suboptimal educational outcomes related to SBP.
Assessment is essential to professional development. Assessment provides the information needed to give feedback, support coaching and the creation of individualized learning plans, inform progress decisions, determine appropriate supervision levels, and, most importantly, help ensure patients and families receive high-quality, safe care in the training environment. While the introduction of competency-based medical education has catalyzed advances in assessment, much work remains to be done.
Background: Workplace-based assessment (WBA) is a key assessment strategy in competency-based medical education. However, its full potential has not been realized because of concerns about reliability, validity, and accuracy. Frame of reference training (FORT), a rater training technique that helps assessors distinguish between learner performance levels, can improve the accuracy and reliability of WBA, but the effect size is variable.
Purpose: Accurate assessment of clinical performance is essential to ensure graduating residents are competent for unsupervised practice. The Accreditation Council for Graduate Medical Education milestones framework is the most widely used competency-based framework in the United States. However, the relationship between residents' milestones competency ratings and their subsequent early career clinical outcomes has not been established.
Importance: Previous studies have demonstrated racial and ethnic inequities in medical student assessments, awards, and faculty promotions at academic medical centers. Few data exist about similar racial and ethnic disparities at the level of graduate medical education.
Objective: To examine the association between race and ethnicity and performance assessments among a national cohort of internal medicine residents.
Purpose: Prior research evaluating workplace-based assessment (WBA) rater training effectiveness has not measured improvement in narrative comment quality and accuracy, nor accuracy of prospective entrustment-supervision ratings. The purpose of this study was to determine whether rater training, using performance dimension and frame of reference training, could improve WBA narrative comment quality and accuracy. A secondary aim was to assess impact on entrustment rating accuracy.
Background: The COVID-19 pandemic has affected every facet of American health care, including graduate medical education (GME). Prior studies show that COVID-19 resulted in reduced opportunities for elective surgeries, lower patient volumes, altered clinical rotations, increased reliance on telemedicine, and dependence on virtual didactic conferences. These studies, however, focused on individual specialties.
Background: Graduate medical education (GME) program leaders struggle to incorporate quality measures in the ambulatory care setting, leading to knowledge gaps on how to provide feedback to residents and programs. While nationally collected quality of care data are available, their reliability for individual resident learning and for GME program improvement is understudied.
Objective: To examine the reliability of the Healthcare Effectiveness Data and Information Set (HEDIS) clinical performance measures in family medicine and internal medicine GME programs and to determine whether HEDIS measures can inform residents and their programs about the quality of care they provide.
Undergraduate and graduate medical education have long embraced uniqueness and variability in curricular and assessment approaches. Some of this variability is justified (warranted or necessary variation), but a substantial portion represents unwarranted variation. A primary tenet of outcomes-based medical education is ensuring that all learners acquire essential competencies to be publicly accountable to meet societal needs.
The graduate medical education (GME) system is heavily subsidized by the public in return for producing physicians who meet society's needs. Under the terms of this implicit social contract, decisions about how this funding is allocated are deferred to the individual training sites. Institutions receiving public funding face potential conflicts of interest, which have at times prioritized institutional purposes and needs over societal needs, highlighting that there is little public accountability for how such funding is used.
Advancement toward competency-based medical education (CBME) has been hindered by inertia and a myriad of implementation challenges, including those associated with assessment of competency, accreditation/regulation, and logistical considerations. The COVID-19 pandemic disrupted medical education at every level. Time-in-training sometimes was shortened or significantly altered and there were reductions in the number and variety of clinical exposures.
Introduction: Competency-based medical education (CBME) provides a framework for describing learner progression throughout training. However, specific approaches to CBME implementation vary widely across educational settings. Alignment between various methods used across the continuum is critical to support transitions and assess learner performance.