Objectives: This study aimed to develop a deep learning radiomic model using multimodal imaging to differentiate benign and malignant breast tumours.
Methods: Multimodal imaging data, including ultrasonography (US), mammography (MG), and magnetic resonance imaging (MRI), from 322 patients with histopathologically confirmed breast tumours (112 benign, 210 malignant) were retrospectively collected between December 2018 and May 2023. Based on the multimodal imaging, the experiments comprised three parts: traditional radiomics, deep learning radiomics, and feature fusion.
This study aims to establish an effective benign-versus-malignant classification model for breast tumor ultrasound images using conventional radiomics and transfer-learning features. We collaborated with a local hospital to collect a base dataset (Dataset A) of 1050 single-lesion 2D ultrasound images, comprising 593 benign and 357 malignant tumor cases. The experimental approach comprises three main parts: conventional radiomics, transfer learning, and feature fusion.
The purpose of this study was to fuse conventional radiomic and deep features from digital breast tomosynthesis craniocaudal projection (DBT-CC) and ultrasound (US) images to establish a multimodal benign-malignant classification model and evaluate its clinical value. Data were obtained from a total of 487 patients at three centers, each of whom underwent DBT-CC and US examinations. A total of 322 patients from dataset 1 were used to construct the model, while 165 patients from datasets 2 and 3 formed the prospective testing cohort.
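The feature-fusion step recurring in these abstracts can be sketched as early fusion: conventional radiomic features and deep (transfer-learning) features are normalised per block and concatenated into a single vector per lesion before classification. The following is a minimal illustrative sketch, not the authors' code; the function names, feature counts, and random toy data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def zscore(x, eps=1e-8):
    """Standardise each feature column to zero mean, unit variance."""
    return (x - x.mean(axis=0)) / (x.std(axis=0) + eps)

def fuse_features(radiomic, deep):
    """Early fusion: normalise each feature block, then concatenate column-wise."""
    return np.concatenate([zscore(radiomic), zscore(deep)], axis=1)

# Toy stand-ins: 8 lesions, 5 conventional radiomic features,
# 12 deep features extracted by a pretrained network (hypothetical sizes).
radiomic = rng.normal(size=(8, 5))
deep = rng.normal(size=(8, 12))
fused = fuse_features(radiomic, deep)
print(fused.shape)  # (8, 17)
```

The fused matrix would then feed a downstream classifier (e.g. logistic regression or an SVM); per-block normalisation keeps the typically higher-dimensional deep features from dominating the handcrafted ones purely by scale.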
Breast and thyroid cancers are the two most common cancers among women worldwide, and their early clinical diagnosis often relies on ultrasonography. However, most ultrasound images of breast and thyroid cancer lack specificity, which reduces the accuracy of clinical ultrasound diagnosis.
Background: The rapid development of artificial intelligence has improved automatic breast cancer diagnosis beyond what traditional machine learning methods achieve. Convolutional neural networks (CNNs) can automatically select highly discriminative features, which helps raise the level of computer-aided diagnosis (CAD), improves performance in distinguishing benign from malignant breast ultrasound (BUS) tumor images, and makes rapid breast tumor screening possible.