Background: Little is known about the reliability of data collected by abstractors without professional medical training. This investigation sought to determine the level of agreement achieved by untrained volunteer abstractors as part of a study evaluating venous thromboembolism risk assessment in patients who have undergone trauma.

Methods: Forty-nine paper charts were chosen randomly from a volunteer-reviewed cohort of 2,339, and the volunteers' abstractions were compared with those of a single experienced abstractor. Inter-rater agreement was assessed using percent agreement, Cohen's kappa, and prevalence-adjusted bias-adjusted kappa (PABAK).
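The abstract does not say how these statistics were computed; for orientation only, the sketch below (hypothetical ratings, not data from the study) shows the three measures for a single binary data point: percent agreement is the observed proportion of matching ratings, Cohen's kappa corrects that proportion for chance agreement derived from each rater's marginal frequencies, and PABAK fixes the chance term at 0.5, which reduces to 2 * observed agreement - 1.

```python
# Minimal sketch (assumed, not the study's analysis code): the three agreement
# statistics named in Methods for one binary data point rated by a volunteer
# and an experienced abstractor. All ratings below are hypothetical.
from collections import Counter

def agreement_stats(rater_a, rater_b):
    """Return (percent agreement, Cohen's kappa, PABAK) for two raters."""
    n = len(rater_a)
    # Observed proportion of charts on which the two abstractors agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement expected from each rater's marginal frequencies.
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n)
              for c in set(rater_a) | set(rater_b))
    kappa = (p_o - p_e) / (1 - p_e) if p_e < 1 else float("nan")
    # PABAK fixes chance agreement at 0.5, so it reduces to 2 * p_o - 1.
    pabak = 2 * p_o - 1
    return p_o, kappa, pabak

# Hypothetical ratings for 10 charts (1 = finding present, 0 = absent).
volunteer = [1, 1, 0, 1, 0, 0, 1, 0, 1, 1]
expert    = [1, 1, 0, 0, 0, 0, 1, 0, 1, 1]
print(agreement_stats(volunteer, expert))  # approx. (0.90, 0.80, 0.80)
```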

Results: Of the 71 data points, 28 had perfect agreement. The average agreement across all charts was 97%. Data points with imperfect agreement had kappa values between .27 and .96 (mean, .75); one additional data point had a kappa of zero despite 94% agreement. PABAK values ranged from .67 to .98 (mean, .91), an average increase of .17 over the corresponding kappa values.
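To illustrate the prevalence effect behind that kappa of zero (a constructed example, not data from the study): if a volunteer marks a rare finding in 3 of 50 charts and the experienced abstractor never records it, observed agreement is 47/50 = 0.94, chance agreement from the marginals is also (47/50)(50/50) + (3/50)(0/50) = 0.94, so kappa = (0.94 - 0.94)/(1 - 0.94) = 0, whereas PABAK = 2(0.94) - 1 = 0.88. This adjustment for prevalence and bias is why the PABAK values sit above the corresponding kappa values.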

Conclusions: The volunteers' performance showed outstanding inter-rater reliability; however, limitations in data interpretation can influence reliability.

Source: http://dx.doi.org/10.1016/j.amjsurg.2013.01.021
