Background: The use of videos of surgical and medical techniques for educational purposes has grown in recent years. To our knowledge, there is no validated tool to specifically assess the quality of these types of videos. Our goal was to create an evaluation tool and study its intrarater and interrater reliability and its acceptability. We named our tool UM-OSCAARS (Université de Montréal Objective and Structured Checklist for Assessment of Audiovisual Recordings of Surgeries/techniques).
Methods: UM-OSCAARS is a grid containing 10 criteria, each graded on an ordinal Likert-type scale of 1 to 5 points. We tested the grid with the help of 4 volunteer otolaryngology-head and neck surgery specialists who individually viewed 10 preselected videos. The evaluators graded each criterion for each video. To evaluate intrarater reliability, the evaluation took place in 2 phases separated by 4 weeks. Interrater reliability was assessed by comparing the 4 top-ranked videos of each evaluator.
Results: There was almost-perfect agreement among the evaluators on the 4 highest-scoring videos, demonstrating that the tool has excellent interrater reliability. There was also excellent test-retest correlation, demonstrating the tool's intrarater reliability.
Conclusion: The UM-OSCAARS has proven reliable and acceptable to use, but its validity needs to be more thoroughly assessed. We hope this tool will lead to an improvement in the quality of technical videos used for educational purposes.
Full text (PMC): http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8064249
DOI: http://dx.doi.org/10.1503/cjs.018418