
Inter- and Intra-Observer Agreement of PD-L1 SP142 Scoring in Breast Carcinoma-A Large Multi-Institutional International Study.

AI Article Synopsis

  • A study involving 100 core biopsies found absolute agreement on PD-L1 scores in only 52% to 60% of cases, although agreement improved among experienced pathologists, especially in the second round of scoring.
  • The findings indicate that while most pathologists show strong agreement in PD-L1 scoring, low-expressing cases can still be challenging due to technical issues, suggesting the need for improved methods or expert consultations.

Article Abstract

The assessment of PD-L1 expression in triple-negative breast cancer (TNBC) is a prerequisite for selecting patients for immunotherapy. Accurate assessment of PD-L1 is pivotal, but the data suggest poor reproducibility. A total of 100 core biopsies were stained using the VENTANA Roche SP142 assay, scanned and scored by 12 pathologists. Absolute agreement, consensus scoring, Cohen's Kappa and intraclass correlation coefficient (ICC) were assessed. A second scoring round was carried out after a washout period to assess intra-observer agreement. Absolute agreement occurred in 52% and 60% of cases in the first and second rounds, respectively. Overall agreement was substantial (Kappa 0.654-0.655) and higher for expert pathologists, particularly on scoring TNBC (0.600 vs. 0.568 in the second round). The intra-observer agreement was substantial to almost perfect (Kappa: 0.667-0.956), regardless of PD-L1 scoring experience. The expert scorers were more concordant in evaluating staining percentage than the non-experienced scorers (R = 0.920 vs. 0.890). Discordance predominantly occurred in low-expressing cases around the 1% value. Technical issues also contributed to the discordance. The study shows reassuringly strong inter- and intra-observer concordance among pathologists in PD-L1 scoring. A proportion of low-expressors remains challenging to assess, and these cases would benefit from addressing the technical issues, testing a different sample and/or referring for expert opinions.
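As background not stated in the article: Cohen's kappa corrects the raw agreement rate for agreement expected by chance. With p_o the observed proportion of agreement between two scorers and p_e the proportion expected by chance alone, the statistic is

    kappa = (p_o - p_e) / (1 - p_e)

For an illustrative (hypothetical) pair of scorers who agree on 80% of cases while chance agreement is 50%, kappa = (0.80 - 0.50) / (1 - 0.50) = 0.60. On the widely used Landis and Koch benchmarks, 0.61-0.80 is read as "substantial" and 0.81-1.00 as "almost perfect" agreement, which is how the reported ranges of 0.654-0.655 (inter-observer) and 0.667-0.956 (intra-observer) map to the wording in the abstract.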

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10000421
DOI Listing: http://dx.doi.org/10.3390/cancers15051511

Publication Analysis

Top Keywords

intra-observer agreement: 12
inter- intra-observer: 8
assessment pd-l1: 8
absolute agreement: 8
second round: 8
agreement substantial: 8
pd-l1 scoring: 8
agreement: 6
scoring: 6
pd-l1: 5
