Interrater Reliability in Toxicity Identification: Limitations of Current Standards.

Int J Radiat Oncol Biol Phys

Department of Radiation Oncology, Duke University, Durham, North Carolina; Department of Radiation Oncology, University of California, San Francisco, San Francisco, California; Bakar Computational Health Sciences Institute, University of California, San Francisco, San Francisco, California.

Published: August 2020

AI Article Synopsis

  • The study evaluated the interrater reliability of the CTCAE v5.0 for identifying and grading oncology-related toxicities, revealing significant discrepancies among reviewers.
  • Two reviewers analyzed 100 patient notes, and a third reviewer helped resolve disagreements, using statistical measures to assess reliability.
  • Results showed a moderate level of agreement, indicating that assessing toxicity can be challenging and highlighting the need for better training and simplification of criteria in clinical trials.

Article Abstract

Purpose: The National Cancer Institute Common Terminology Criteria for Adverse Events (CTCAE) v5.0 is the standard for oncology toxicity encoding and grading, despite limited validation. We assessed interrater reliability (IRR) in multireviewer toxicity identification.

Methods And Materials: Two reviewers independently reviewed 100 randomly selected notes for weekly on-treatment visits during radiation therapy from the electronic health record. Discrepancies were adjudicated by a third reviewer for consensus. Term harmonization was performed to account for overlapping symptoms in CTCAE. IRR was assessed based on unweighted and weighted Cohen's kappa coefficients.
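The agreement statistics named above (unweighted and linearly weighted Cohen's kappa) can be sketched as follows. This is an illustrative computation on toy data, not the study's code; the rater labels and the 0-2 grading scale are assumptions for the example.

```python
from collections import Counter

def cohen_kappa(r1, r2):
    """Unweighted Cohen's kappa for two raters' labels."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n           # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    cats = set(c1) | set(c2)
    p_e = sum(c1[k] * c2[k] for k in cats) / n**2           # chance agreement
    return (p_o - p_e) / (1 - p_e)

def weighted_kappa(r1, r2, k):
    """Linearly weighted kappa for ordinal grades 0..k-1."""
    n = len(r1)
    w = lambda i, j: abs(i - j) / (k - 1)                   # linear disagreement weight
    obs = sum(w(a, b) for a, b in zip(r1, r2))
    c1, c2 = Counter(r1), Counter(r2)
    exp = sum(w(i, j) * c1[i] * c2[j] / n
              for i in range(k) for j in range(k))
    return 1 - obs / exp

# Toy example: two raters grading 5 notes on a 0-2 severity scale.
a = [0, 0, 1, 1, 2]
b = [0, 1, 1, 1, 2]
print(round(cohen_kappa(a, b), 4))       # 0.6875
print(round(weighted_kappa(a, b, 3), 4)) # 0.7368
```

The weighted variant penalizes disagreements by their distance on the ordinal scale, which is why a study grading toxicities by severity reports both forms.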

Results: Between reviewers, the unweighted kappa was 0.68 (95% confidence interval, 0.65-0.71) and the weighted kappa was 0.59 (0.22-1.00). IRR was consistent between symptoms noted as present or absent with a kappa of 0.6 (0.66-0.71) and 0.6 (0.65-0.69), respectively.

Conclusions: Significant discordance suggests toxicity identification, particularly retrospectively, is a complex and error-prone task. Strategies to improve IRR, including training and simplification of the CTCAE criteria, should be considered in trial design and future terminologies.

Source: http://dx.doi.org/10.1016/j.ijrobp.2020.04.040

Publication Analysis

Top Keywords

interrater reliability: 8
toxicity identification: 8
toxicity: 4
reliability toxicity: 4
identification limitations: 4
limitations current: 4
current standards: 4
standards purpose: 4
purpose national: 4
national cancer: 4
