A paired kappa to compare binary ratings across two medical tests.

Stat Med

Department of Statistics, University of South Carolina, Columbia, South Carolina.

Published: July 2019

Agreement between experts' ratings is an important prerequisite for an effective screening procedure. In clinical settings, large-scale studies are often conducted to compare the agreement of experts' ratings between new and existing medical tests, for example, digital versus film mammography. Challenges arise in these studies because many experts rate the same sample of patients undergoing two medical tests, leading to a complex correlation structure between experts' ratings. Here, we propose a novel paired kappa measure to compare the agreement between the binary ratings of many experts across two medical tests. Existing approaches can accommodate only a small number of experts, rely heavily on Cohen's kappa and Scott's pi measures of agreement, and thus are prone to their drawbacks. The proposed kappa appropriately accounts for correlations between ratings due to patient characteristics, corrects for agreement due to chance, and is robust to the effects of disease prevalence and to other flaws inherent in the use of Cohen's kappa. It can be easily calculated in the software package R. In contrast to existing approaches, the proposed measure can flexibly incorporate large numbers of experts and patients by utilizing the generalized linear mixed models framework. It is intended to be used in population-based studies, increasing efficiency without increasing modeling complexity. Extensive simulation studies demonstrate low bias and excellent coverage probability of the proposed kappa under a broad range of conditions. The methods are applied to a recent nationwide breast cancer screening study comparing film mammography to digital mammography.
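For orientation only (the abstract does not give the estimator's exact form), kappa-type agreement measures of the kind the paper builds on share the standard chance-corrected structure, with p_o the observed proportion of agreement and p_e the proportion of agreement expected by chance:

\[
  \kappa = \frac{p_o - p_e}{1 - p_e}
\]

The abstract also states that the proposed measure is built on the generalized linear mixed models framework and can be calculated in R. The lines below are a minimal sketch, not the authors' code, of the kind of crossed random effects binary GLMM that such a measure rests on; the data frame `ratings` and its column names (rating, test, patient, expert) are hypothetical.

# Minimal sketch, assuming a long-format data frame `ratings` with a 0/1 rating,
# a test indicator (film vs digital), and patient and expert identifiers.
library(lme4)
fit <- glmer(rating ~ test + (1 | patient) + (1 | expert),
             data = ratings, family = binomial)
summary(fit)
# A chance-corrected agreement summary such as the proposed paired kappa would be
# derived from the fitted model; see the full text for the authors' construction.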

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6884009
DOI: http://dx.doi.org/10.1002/sim.8200
