Consensus on how to assess non-technical skills is lacking. This systematic review aimed to evaluate the evidence regarding non-technical skills assessments in undergraduate medical education, describing the tools used, the learning outcomes measured, and the validity, reliability, and psychometric properties of the instruments. A standardized search of online databases was conducted, and consensus was reached on the included studies. Data extraction, quality assessment, and content analysis were conducted per Best Evidence in Medical Education guidelines. Nine papers met the inclusion criteria. Assessment methods broadly fell into three categories: simulated clinical scenarios, objective structured clinical examinations, and questionnaires or written assessments. Tools to assess non-technical skills were often developed locally, without reference to conceptual frameworks. Consequently, the tools were rarely validated, limiting dissemination and replication. There were clear themes in content and broad categories in the methods of assessment employed. The quality of this evidence was poor due to a lack of theoretical underpinning, with most assessments not forming part of the normal assessment process but rather produced as a specific outcome measure for a teaching-based study. While the current literature forms a good starting point for educators developing materials, future work is needed to address these weaknesses, as such tools are required across health education.

Source
http://dx.doi.org/10.1080/0142159X.2018.1562166
