Background: Patients with low nuclear grade ductal carcinoma in situ (DCIS) can adopt proactive management strategies to avoid unnecessary surgical resection. Different personalized treatment modalities may be selected based on the expression status of molecular markers, which is also predictive of different outcomes and risks of recurrence. DCIS ultrasound findings are mostly non-mass lesions, making it difficult to determine lesion boundaries. Studies have shown that models based on deep learning radiomics (DLR) have advantages in the automatic recognition of tumor contours, while machine learning models based on clinical imaging features can explain the relative importance of those features.
Methods: The available ultrasound data of 349 patients with pure DCIS confirmed by surgical pathology [54 low nuclear grade, 175 estrogen receptor positive (ER+), 163 progesterone receptor positive (PR+), and 81 human epidermal growth factor receptor 2 positive (HER2+)] were collected. Radiologists extracted ultrasonographic features of DCIS lesions based on the 5th Edition of the Breast Imaging Reporting and Data System (BI-RADS). Patient age and BI-RADS characteristics were used to construct clinical machine learning (CML) models. The RadImageNet pretrained network was used to extract radiomics features, which served as the input for DLR modeling. For the training and validation datasets, 80% and 20% of the data, respectively, were used. Logistic regression (LR), support vector machine (SVM), random forest (RF), and eXtreme Gradient Boosting (XGBoost) algorithms were applied and compared for the final classification modeling. For each task, the area under the receiver operating characteristic curve (AUC) was used to evaluate the effectiveness of the DLR and CML models.
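The two-stage pipeline described above (pretrained-network feature extraction followed by classical classifiers, with an 80%/20% split and AUC evaluation) can be sketched as follows. This is a minimal illustration, not the authors' code: the RadImageNet weights and the patient ultrasound images are not available here, so random feature vectors stand in for the extracted radiomics features, and scikit-learn's GradientBoostingClassifier stands in for XGBoost to keep the sketch dependency-free.

```python
# Hedged sketch of the study's classifier-comparison stage, assuming
# scikit-learn. Random vectors replace the RadImageNet-extracted features,
# and placeholder labels replace the pathology-confirmed marker status.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_patients, n_features = 349, 512      # 349 patients as in the study; 512-d features assumed
X = rng.normal(size=(n_patients, n_features))
y = rng.integers(0, 2, size=n_patients)  # placeholder binary labels (e.g., HER2+ vs. HER2-)

# 80% training / 20% validation split, as in the paper
X_tr, X_va, y_tr, y_va = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

classifiers = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "GB (XGBoost stand-in)": GradientBoostingClassifier(random_state=0),
}

# Fit each classifier and score it by validation AUC, mirroring the
# model-comparison procedure in the Methods.
aucs = {}
for name, clf in classifiers.items():
    clf.fit(X_tr, y_tr)
    aucs[name] = roc_auc_score(y_va, clf.predict_proba(X_va)[:, 1])
    print(f"{name}: validation AUC = {aucs[name]:.3f}")
```

With random features and labels the AUCs hover near chance; in the study, the same comparison is run on real RadImageNet features and BI-RADS clinical features, and the best model per task is reported.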
Results: In the training dataset, low nuclear grade, ER+, PR+, and HER2+ DCIS lesions accounted for 19.20%, 65.12%, 61.21%, and 30.19%, respectively; in the validation dataset, they accounted for 19.30%, 62.50%, 57.14%, and 30.91%, respectively. Among the DLR models we developed, the best AUC values were 0.633 for identifying low nuclear grade, achieved by the XGBoost classifier on ResNet50 features; 0.618 for identifying ER, achieved by the RF classifier on InceptionV3 features; 0.755 for identifying PR, achieved by the XGBoost classifier on InceptionV3 features; and 0.713 for identifying HER2, achieved by the LR classifier on ResNet50 features. The CML models performed better than the DLR models in predicting low nuclear grade, ER+, PR+, and HER2+ DCIS lesions. The best AUC values by classifier were as follows: for low nuclear grade, RF (AUC: 0.719); for ER+, XGBoost (AUC: 0.761); for PR+, XGBoost (AUC: 0.780); and for HER2+, RF (AUC: 0.723).
Conclusions: Based on small-scale datasets, our study showed that the DLR models developed with the RadImageNet pretrained network, together with the CML models, may help predict low nuclear grade, ER+, PR+, and HER2+ DCIS lesions, so that patients can benefit from hierarchical and personalized treatment.
Full text: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11074652 (PMC)
DOI: http://dx.doi.org/10.21037/gs-23-417