How Reliable are Single-Question Workplace-Based Assessments in Surgery?

J Surg Educ

Center for Surgical Training and Research, Department of Surgery, University of Michigan, 010-A193 North Campus Research Complex, 2800 Plymouth Road, Ann Arbor, Michigan 48109.

Published: July 2024

AI Article Synopsis

  • The study evaluates inter-rater reliability of workplace-based assessments (WBAs) used to measure surgical trainees' performance, which is crucial as multiple faculty members assess the same residents.
  • Nine faculty members rated videos of surgical residents performing operations using several scales; findings showed low to moderate reliability across these assessments, with intraclass correlation coefficients ranging from 0.33 to 0.47.
  • The research suggests that while training faculty could enhance consistency, a higher volume of assessments from different raters is necessary to accurately evaluate trainee performance over time.

Article Abstract

Objective: Workplace-based assessments (WBAs) play an important role in the assessment of surgical trainees. Because these assessment tools are utilized by a multitude of faculty, inter-rater reliability is important to consider when interpreting WBA data. Although there is evidence supporting the validity of many of these tools, inter-rater reliability evidence is lacking. This study aimed to evaluate the inter-rater reliability of multiple operative WBA tools utilized in general surgery residency.

Design: General surgery residents and teaching faculty were recorded during 6 general surgery operations. Nine faculty raters each reviewed 6 videos and rated each resident on performance (using the Society for Improving Medical Professional Learning (SIMPL) Performance Scale as well as the Operative Performance Rating System (OPRS) Scale), entrustment (using the ten Cate Entrustment-Supervision Scale), and autonomy (using the Zwisch Scale). The ratings were reviewed for inter-rater reliability using percent agreement and intraclass correlations.
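
As a rough sketch (not the authors' code) of the two statistics named above, the Python below computes a single-measure, absolute-agreement intraclass correlation coefficient, ICC(2,1) in the Shrout and Fleiss notation, and simple pairwise percent agreement from a hypothetical videos-by-raters score matrix. The 6 x 9 dimensions mirror the study design, but the ratings themselves are made up.

# Hedged sketch, not the study's analysis code: ICC(2,1) and pairwise
# percent agreement for a (videos x raters) matrix of WBA scores.
import numpy as np
from itertools import combinations


def icc_2_1(scores: np.ndarray) -> float:
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    `scores` has shape (n_targets, n_raters), e.g. 6 videos x 9 raters.
    Mean squares follow the Shrout & Fleiss (1979) decomposition.
    """
    n, k = scores.shape
    grand_mean = scores.mean()
    row_means = scores.mean(axis=1)   # per-video means
    col_means = scores.mean(axis=0)   # per-rater means

    ss_total = ((scores - grand_mean) ** 2).sum()
    ss_rows = k * ((row_means - grand_mean) ** 2).sum()
    ss_cols = n * ((col_means - grand_mean) ** 2).sum()
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)                 # between-video variance
    ms_cols = ss_cols / (k - 1)                 # between-rater variance
    ms_error = ss_error / ((n - 1) * (k - 1))   # residual variance

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )


def percent_agreement(scores: np.ndarray) -> float:
    """Mean fraction of rater pairs giving identical ratings per video."""
    per_video = []
    for row in scores:
        pairs = list(combinations(row, 2))
        per_video.append(np.mean([a == b for a, b in pairs]))
    return float(np.mean(per_video))


if __name__ == "__main__":
    # Hypothetical ratings: 6 videos rated by 9 faculty on a 1-5 scale.
    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 6, size=(6, 9))
    print(f"ICC(2,1): {icc_2_1(ratings):.2f}")
    print(f"Percent agreement: {percent_agreement(ratings):.2%}")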

Participants: Nine faculty members viewed the videos and assigned ratings for multiple WBAs.

Results: Absolute intraclass correlation coefficients for each scale ranged from 0.33 to 0.47.

Conclusions: All single-item WBA scales had low to moderate inter-rater reliability. While rater training may improve inter-rater reliability for single observations, many observations by many raters are needed to reliably assess trainee performance in the workplace.
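
To illustrate why many ratings are needed (a standard psychometric argument, not an analysis from the paper), the Spearman-Brown prophecy formula can estimate how many independent ratings would be required to reach a conventional composite reliability of 0.80, starting from single-rater ICCs at the low and high ends of the reported 0.33 to 0.47 range.

# Hedged illustration: Spearman-Brown prophecy formula, solved for the
# number of raters k needed to reach a target composite reliability.

def raters_needed(single_rater_icc: float, target_reliability: float) -> float:
    """Number of ratings k whose mean reaches the target reliability.

    Derived from ICC(k) = k*ICC / (1 + (k - 1)*ICC), solved for k.
    """
    return (target_reliability * (1 - single_rater_icc)) / (
        single_rater_icc * (1 - target_reliability)
    )


if __name__ == "__main__":
    # Low and high ends of the reported single-rater ICC range.
    for icc in (0.33, 0.47):
        k = raters_needed(icc, target_reliability=0.80)
        print(f"Single-rater ICC {icc:.2f}: ~{k:.1f} independent ratings "
              "needed for a composite reliability of 0.80")

Under these assumptions, roughly 5 to 8 independent ratings per trainee would be needed, which is consistent with the conclusion that many observations by many raters are required.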

Source
DOI: http://dx.doi.org/10.1016/j.jsurg.2024.03.015

Publication Analysis

Top Keywords

inter-rater reliability (20)
general surgery (12)
workplace-based assessments (8)
tools utilized (8)
inter-rater (5)
reliability (5)
scale (5)
reliable single-question (4)
single-question workplace-based (4)
assessments surgery? (4)
