Survival analysis in patient populations has conventionally been conducted using the Cox proportional hazards model. This study aimed to evaluate the performance of machine learning algorithms in predicting survival beyond 5 years for individual patients with colorectal cancer (CRC). A total of 475 CRC patients with complete data who had undergone surgery for CRC were analyzed, and each patient's probability of surviving more than 5 years was estimated using machine learning based on penalized Cox regression. These individual 5-year survival predictions formed the basis of the performance evaluation. Receiver operating characteristic curves were analyzed for the LASSO-penalized model, the SCAD-penalized model, the unpenalized model, and the RSF model. The least absolute shrinkage and selection operator (LASSO) penalized model displayed a mean AUC of 0.67 ± 0.06, the smoothly clipped absolute deviation (SCAD) penalized model exhibited a mean AUC of 0.65 ± 0.07, and the unpenalized model showed a mean AUC of 0.64 ± 0.09. Notably, the random survival forests (RSF) model outperformed the others, demonstrating the most favorable performance with a mean AUC of 0.71 ± 0.05. Compared with the conventional unpenalized Cox model, the machine learning techniques (LASSO, SCAD, RSF) showed advantages for data interpretation.
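The evaluation summarized above dichotomizes the outcome (survival beyond 5 years vs. not) and reports a mean AUC with its spread across folds. A minimal sketch of that evaluation step is shown below using only synthetic data; the event rate, fold count, and risk scores are illustrative assumptions, not values from the study:

```python
import numpy as np

def roc_auc(risk_scores, died_within_5y):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen patient who died within 5 years has a higher
    predicted risk than one who survived beyond 5 years."""
    scores = np.asarray(risk_scores, dtype=float)
    labels = np.asarray(died_within_5y, dtype=bool)
    pos = scores[labels]      # events within 5 years
    neg = scores[~labels]     # survivors beyond 5 years
    # All pairwise comparisons; ties count half.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical cross-validation loop mirroring how results such as
# "mean AUC 0.67 +/- 0.06" are reported (synthetic data throughout).
rng = np.random.default_rng(0)
fold_aucs = []
for _ in range(5):
    n = 95                                    # ~475 patients over 5 folds
    died = rng.random(n) < 0.4                # illustrative event rate
    score = died + rng.normal(0.0, 1.0, n)    # noisy model risk score
    fold_aucs.append(roc_auc(score, died))
print(f"mean AUC = {np.mean(fold_aucs):.2f} +/- {np.std(fold_aucs):.2f}")
```

The rank-based formulation avoids choosing a threshold on the risk score, which is why AUC is a natural summary for comparing the four models.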
Full text: PMC (http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11175897) | DOI (http://dx.doi.org/10.1097/MD.0000000000038584)