Introduction: Radiomics is a promising imaging-based tool that could enhance clinical observation and identify representative features. To avoid divergent interpretations, the Image Biomarker Standardisation Initiative (IBSI) has defined conditions for feature harmonisation. This study evaluates IBSI-compliant radiomics applications against a known benchmark and on clinical datasets to assess agreement.
Materials and Methods: The three radiomics platforms compared were RadiomiX Research Toolbox, LIFEx v7.0.0, and syngo.via Frontier Radiomics v1.2.5 (based on PyRadiomics v2.1). Basic assessment included comparing feature names and their formulas. The IBSI digital phantom was used for evaluation against reference values. For agreement evaluation (including the same software in different versions), two clinical datasets were used: 27 contrast-enhanced computed tomography (CECT) scans of colorectal liver metastases and 39 magnetic resonance imaging (MRI) examinations of breast cancer, including intravoxel incoherent motion (IVIM) and dynamic contrast-enhanced (DCE) MRI. The intraclass correlation coefficient (ICC; lower bound of the 95% confidence interval) was used, with 0.9 as the threshold for excellent agreement.
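As an illustration of the agreement metric, the sketch below computes an ICC with the lower bound of its 95% confidence interval for one feature measured by two platforms across a set of scans. The abstract does not state which ICC form was used; this sketch assumes a one-way random-effects model, ICC(1) in the Shrout and Fleiss taxonomy, whose confidence interval has a simple closed form. The function name and data layout are hypothetical.

```python
import numpy as np
from scipy.stats import f as f_dist


def icc1_with_lower_bound(data, alpha=0.05):
    """One-way random-effects ICC(1) and the lower bound of its
    (1 - alpha) confidence interval (Shrout & Fleiss, 1979).

    data: (n_subjects, k_raters) array, e.g. one radiomics feature
          computed for each scan (rows) by each platform (columns).
    """
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    row_means = data.mean(axis=1)

    # Between-subject and within-subject mean squares from a one-way ANOVA.
    msb = k * ((row_means - data.mean()) ** 2).sum() / (n - 1)
    msw = ((data - row_means[:, None]) ** 2).sum() / (n * (k - 1))

    icc = (msb - msw) / (msb + (k - 1) * msw)

    # Lower confidence bound via the F-distribution.
    f_obs = msb / msw
    f_low = f_obs / f_dist.ppf(1 - alpha / 2, n - 1, n * (k - 1))
    lower = (f_low - 1) / (f_low + k - 1)
    return icc, lower


# Hypothetical usage: flag "excellent" agreement when the lower
# 95% CI bound exceeds 0.9, as in the study's criterion.
rng = np.random.default_rng(0)
feature = rng.normal(50.0, 10.0, size=27)            # 27 CECT scans
platform_a = feature
platform_b = feature + rng.normal(0.0, 0.1, size=27)  # near-identical values
icc, lower = icc1_with_lower_bound(np.column_stack([platform_a, platform_b]))
excellent = lower >= 0.9
```

The lower confidence bound, rather than the point estimate, is the conservative choice: it guards against declaring agreement "excellent" purely because the sample is small.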
Results: The three radiomics applications share 41 features (3 shape, 8 intensity, 30 texture) out of 172, 84, and 110 features for RadiomiX, LIFEx, and syngo.via, respectively, as well as wavelet filtering; their naming conventions differ, however. Syngo.via had excellent agreement with the IBSI benchmark, while LIFEx and RadiomiX showed slightly worse agreement. Excellent reproducibility was achieved for shape features only, while intensity and texture features varied considerably with the imaging type. For intensity features, excellent agreement ranged from 46% for the DCE maps to 100% for CECT; for texture features, it dropped to 44% and 73%, respectively. Wavelet features produced the greatest variation between applications, with excellent agreement for only 3% to 11% of features.
Conclusion: Even with IBSI compliance, the reproducibility of features between radiomics applications is not guaranteed. To evaluate this variation, quality assurance of radiomics applications should be performed and repeated when updating to a new version or adding a new modality.
DOI: http://dx.doi.org/10.1088/2057-1976/ac8e6f