The objective of this collaborative study was to compare current practices in conducting high-stakes, exit-level Objective Structured Clinical Examinations (OSCEs) across all Australian medical schools. We aimed to document similarities and differences between schools, and to compare existing practice against available gold-standard, evidence-based practice. We also aimed to identify areas where gold standards do not currently exist and could be developed in the future. A 72-item semi-structured questionnaire was sent to all 19 Australian medical schools with graduating students. A total of 18/19 schools responded. Of these, 16/18 schools conducted summative exit-level OSCEs representing content from multiple medical specialties. The total number of OSCE stations varied from 8 to 16, with total OSCE testing time ranging from 70 to 160 min. All schools blueprinted their OSCE to their curriculum, and trained both simulated patients and examiners. There was variation in the format of marking rubric used. This study has provided insight into the current OSCE practices of the majority of medical schools in Australia. Whilst the comparative data reveal wide variation in OSCE practices between schools, many recommended "gold standard" OSCE practices are implemented. The collective awareness of our similarities and differences provides us with a baseline platform, as well as an impetus for iterative quality improvement. Such discourse also serves to develop new gold standards in practice where none have previously existed.
DOI: http://dx.doi.org/10.1080/0142159X.2018.1487547