Curriculum change is relatively frequent in health professional education. Formal, planned curriculum review must be conducted periodically to incorporate new knowledge and skills, changing teaching and learning methods, and changing roles and expectations of graduates. Unplanned curriculum evolution arguably happens continually, usually taking the form of "minor" changes that, in combination over time, may produce a substantially different programme.
In this paper, the authors offer perspectives on uses of technology and assessment that support learning. The perspectives are viewed through validity (from the field of assessment) as a framework, and the authors discuss four aspects of an interconnected technology, learning and assessment space that represent theory-informed, authentic practice. The four are: 1) integrated coherence for learning, assessment and technology; 2) responsibilities for equity, diversity, inclusion and wellbeing; 3) sustainability; and 4) balancing resources in global contexts.
Introduction: The Ottawa Conference on the Assessment of Competence in Medicine and the Healthcare Professions was first convened in 1985 in Ottawa. Since then, what has become known as the Ottawa conference has been held in various locations around the world every 2 years. It has become an important conference for the community of assessment - including researchers, educators, administrators and leaders - to share contemporary knowledge and develop international standards for assessment in medical and health professions education.
Introduction: This study aimed to explore the decision-making processes of raters during objective structured clinical examinations (OSCEs), in particular to explore the tacit assumptions and beliefs of raters as well as rater idiosyncrasies.
Methods: Thinking aloud protocol interviews were used to gather data on the thoughts of examiners during their decision-making, while watching trigger OSCE videos and rating candidates. A purposeful recruiting strategy was taken, with a view to interviewing both examiners with many years of experience (greater than six years) and those with less experience examining at final medical examination level.
A preparatory framework called EASI (Evaluate, Align, Student-centred, Implement and Improve) was developed with the aim of creating awareness about interim options and implementation opportunities for online Clinical and Communication Skills (CCS) learning. The framework, when applied, requires faculty to evaluate current resources, align sessions to learning outcomes through student-centred approaches, and continuously improve based on implementation experiences. Using the framework, we were able to generate various types of online CCS learning sessions for implementation within the short time frame imposed by the recent Covid-19 pandemic.
Introduction: In 2011 the Consensus Statement on Performance Assessment was published in Medical Teacher. That paper was commissioned by AMEE (Association for Medical Education in Europe) as part of the series of Consensus Statements following the 2010 Ottawa Conference. In 2019, it was recommended that a working group be reconvened to review and consider developments in performance assessment since the 2011 publication.
The impact of the COVID-19 pandemic on the teaching and assessment of clinical skills continues to pose significant challenges for healthcare education providers worldwide.
The COVID-19 pandemic has presented significant challenges for medical schools.
This paper reports on a study that compares estimates of the reliability of a suite of workplace-based assessment forms as employed to formatively assess the progress of trainee obstetricians and gynaecologists. The use of such forms of assessment is growing nationally and internationally in many specialties, but there is little research evidence comparing procedures/competencies and form types across an entire specialty. Generalisability theory combined with a multilevel modelling approach is used to estimate variance components, G-coefficients and standard errors of measurement across 13 procedures and three form types (mini-CEX, OSATS and CbD).
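For context, the G-coefficients and standard errors of measurement mentioned above are typically defined from estimated variance components. The facet structure of the study itself is not reproduced here; the expression below is a generic sketch for a fully crossed design with trainees (p), cases or forms (i) and raters (r), offered as an illustration rather than the study's own model.

$$
E\rho^{2} = \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \sigma^{2}_{\delta}},
\qquad
\sigma^{2}_{\delta} = \frac{\sigma^{2}_{pi}}{n_{i}} + \frac{\sigma^{2}_{pr}}{n_{r}} + \frac{\sigma^{2}_{pir,e}}{n_{i}\,n_{r}},
\qquad
\mathrm{SEM} = \sqrt{\sigma^{2}_{\delta}}
$$

Here $\sigma^{2}_{p}$ is the variance attributable to trainees, the interaction terms contribute to relative error, and increasing the numbers of cases ($n_{i}$) or raters ($n_{r}$) reduces that error.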
Best Pract Res Clin Obstet Gynaecol (December 2010)
Workplace assessment has been incorporated into speciality training in the UK following changes in training and work patterns within the National Health Service (NHS). Various types of assessment tools have been adopted to assess the clinical competence of trainees. In obstetrics and gynaecology, these include the mini-clinical evaluation exercise (mini-CEX), Objective Structured Assessment of Technical Skills (OSATS) and case-based discussions (CbDs).
Clinical teachers are often involved in assessing clinical competence in the workplace and in universities and colleges. Assessments commonly used to formally assess clinical competence include long and short cases and the objective structured clinical examination which, if well designed, is a fair and reliable method of assessing clinical competence.
Background: The UK General Medical Council (GMC) in its regulatory capacity conducts formal tests of competence (TOCs) on doctors whose performance is of concern. TOCs are individually tailored to each doctor's specialty and grade.
Aims: To describe the development and implementation of an electronic blueprinting system that supports the delivery of TOCs.
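As a rough illustration of what such blueprinting involves, a blueprint can be represented as required item counts per competency domain, against which a proposed item set is checked for coverage before a TOC is assembled. All domain names, item identifiers and counts below are hypothetical and are not drawn from the GMC system described above.

```python
# Rough sketch of a test blueprint as a data structure (hypothetical data).
# The blueprint records how many items each competency domain requires;
# a proposed item set is checked for coverage against it.

from collections import Counter

blueprint = {
    "history_taking": 3,
    "examination": 3,
    "communication": 2,
    "procedural_skills": 2,
}

selected_items = [  # (item_id, domain) pairs drawn from an item bank
    ("KT101", "history_taking"), ("KT214", "history_taking"), ("KT305", "history_taking"),
    ("EX040", "examination"), ("EX077", "examination"), ("EX112", "examination"),
    ("CM009", "communication"), ("CM023", "communication"),
    ("PR150", "procedural_skills"), ("PR162", "procedural_skills"),
]

coverage = Counter(domain for _, domain in selected_items)
shortfall = {d: n - coverage[d] for d, n in blueprint.items() if coverage[d] < n}
print("Blueprint satisfied" if not shortfall else f"Domains short of items: {shortfall}")
```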
Context: High-stakes undergraduate clinical assessments should be based on transparent standards comparable between different medical schools. However, simply sharing questions and pass marks may not ensure comparable standards and judgements. We hypothesised that in multicentre examinations, teaching institutions contribute to systematic variations in students' marks between different medical schools through the behaviour of their markers, standard-setters and simulated patients.
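One way to examine a hypothesis of this kind quantitatively is a random-intercept mixed model with school (or examination centre) as the grouping factor, partitioning mark variance into between-school and within-school components. The sketch below uses the statsmodels library and hypothetical, simulated data; it is not the analysis reported in the study above.

```python
# Illustrative sketch (simulated data): partition OSCE mark variance into
# between-school and within-school components with a random-intercept model.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
schools = np.repeat([f"school_{k}" for k in range(6)], 50)
school_shift = np.repeat(rng.normal(0.0, 2.0, 6), 50)           # systematic centre effect
marks = 60 + school_shift + rng.normal(0.0, 8.0, len(schools))  # student-level variation

df = pd.DataFrame({"school": schools, "mark": marks})

result = smf.mixedlm("mark ~ 1", data=df, groups=df["school"]).fit()

between = float(result.cov_re.iloc[0, 0])  # variance between schools
within = float(result.scale)               # residual (within-school) variance
print(f"Estimated share of mark variance between schools: {between / (between + within):.1%}")
```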
Background: While all graduates from medical schools in the UK are granted the same licence to practise by the medical professional regulatory body, the General Medical Council, individual institutions set their own graduating examination systems. Previous studies have suggested that the equivalence of passing standards across different medical schools cannot be guaranteed.
Aims: To explore and formally document the graduating examinations used in UK medical schools and to evaluate whether it is possible to make plausible comparisons in relation to the standard of clinical competence of graduates.
Context: Medical schools in the UK set their own graduating examinations and pass marks. In a previous study we examined the equivalence of passing standards using the Angoff standard-setting method. To address the limitation this imposed on that work, we undertook further research using a standard-setting method specifically designed for objective structured clinical examinations (OSCEs).
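For illustration, a minimal sketch of how an Angoff cut score (the method named for the earlier study) is commonly computed: each judge estimates, item by item, the probability that a borderline candidate would succeed, and the judges' mean estimates are averaged. All judges, items and probability values below are hypothetical.

```python
# Minimal sketch of an Angoff cut-score calculation (illustrative only).
# Each judge estimates, per item, the probability that a borderline
# candidate would succeed; the cut score is the mean across judges of
# their mean item estimates.

judge_estimates = {
    "judge_1": [0.60, 0.55, 0.70, 0.50],
    "judge_2": [0.65, 0.50, 0.75, 0.45],
    "judge_3": [0.55, 0.60, 0.65, 0.50],
}

def angoff_cut_score(estimates: dict[str, list[float]]) -> float:
    """Average each judge's mean item probability, then average over judges."""
    per_judge = [sum(items) / len(items) for items in estimates.values()]
    return sum(per_judge) / len(per_judge)

print(f"Angoff cut score: {angoff_cut_score(judge_estimates):.1%} of available marks")
```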
Background: In our region, it was acknowledged that the process of assessment needed to be improved, but before developing a system for this, there was a need to define the "competent or satisfactory trainee".
Objective: To outline the process by which a consensus was achieved on this standard, and how a system for formally assessing competency across a wide range of knowledge, skills and attitudes was subsequently agreed on, thus enabling increased opportunities for training and feedback and improving the accuracy of assessment in the region.
Methods: The opinions of trainees and trainers from across the region were collated, and a consensus was achieved with regard to the minimum acceptable standard for a trainee in emergency medicine, thus defining a competent trainee.
While Objective Structured Clinical Examinations (OSCEs) have become widely used to assess clinical competence at the end of undergraduate medical courses, the method of setting the passing score varies greatly, and there is no agreed best methodology. While there is an assumption that the passing standard at graduation is the same at all medical schools, there is very little quantitative evidence in the field. In the United Kingdom, there is no national licensing examination; each medical school sets its own graduating assessment, and successful completion by candidates leads to a licence to practise granted by the General Medical Council.
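One widely described approach for setting OSCE passing scores, offered here only as an illustration of the methodological variety noted above and not as the method used by any particular school, is the borderline regression method: station checklist scores are regressed on examiners' global ratings, and the pass mark is the score predicted at the borderline grade. The ratings and scores below are hypothetical.

```python
# Minimal sketch of the borderline regression method for a single OSCE
# station (illustrative only; ratings and scores are hypothetical).

import numpy as np

global_ratings = np.array([1, 2, 2, 3, 3, 4, 4, 5, 5])            # 1 = clear fail ... 5 = excellent
checklist_scores = np.array([8, 11, 12, 15, 16, 19, 20, 23, 24])  # out of 30

slope, intercept = np.polyfit(global_ratings, checklist_scores, 1)
BORDERLINE_GRADE = 2  # assumed position of "borderline" on this rating scale
pass_mark = slope * BORDERLINE_GRADE + intercept
print(f"Station pass mark: {pass_mark:.1f} / 30")
```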
The Handwashing Liaison Group has pointed out that "The failure of healthcare workers to decontaminate their hands reflects fundamentals of attitudes, beliefs and behaviours". Doctors are known to be poor at handwashing. This poor compliance may have its roots in a failure to learn this behaviour at medical college, where the influence of consultants and other role models may be critical.