Objective: To develop an accurate and automatic segmentation model based on a convolutional neural network to segment the prostate and its lesion regions.

Methods: In total, 180 subjects were included: 122 healthy individuals and 58 patients with prostate cancer. For each subject, all slices of the prostate were included in the diffusion-weighted images (DWIs). A novel deep convolutional neural network (DCNN) is proposed to automatically segment the prostate and its lesion regions. The model is inspired by U-Net, with its encoder-decoder path as the backbone, and incorporates dense blocks, attention mechanisms, and group normalization-based Atrous Spatial Pyramid Pooling (ASPP). Data augmentation was used to avoid overfitting during training. In the experimental phase, the dataset was randomly divided into a training set (70%) and a testing set (30%), and four-fold cross-validation was used to obtain results for each metric.
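For orientation, the sketch below illustrates the three building blocks named above (dense block, attention gate on the skip connection, and GroupNorm-based ASPP) in PyTorch. This is a minimal illustration, not the authors' implementation: the layer counts, channel widths, growth rate, group count, dilation rates, and the Oktay-style additive attention gate are all assumptions, since the abstract does not specify them.

```python
# Minimal sketch of the named building blocks (assumed PyTorch implementation).
# All hyperparameters below are illustrative assumptions, not from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DenseBlock(nn.Module):
    """Densely connected conv layers: each layer sees all earlier feature maps."""
    def __init__(self, in_ch, growth=32, n_layers=2):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch  # assumes channel counts divisible by the 8 GroupNorm groups
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.GroupNorm(8, ch),
                nn.ReLU(inplace=True),
                nn.Conv2d(ch, growth, kernel_size=3, padding=1),
            ))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)

class AttentionGate(nn.Module):
    """Additive attention gate (Oktay et al. style, an assumption here):
    a coarse decoder 'gate' signal re-weights the encoder skip features."""
    def __init__(self, skip_ch, gate_ch, inter_ch):
        super().__init__()
        self.theta = nn.Conv2d(skip_ch, inter_ch, kernel_size=1)
        self.phi = nn.Conv2d(gate_ch, inter_ch, kernel_size=1)
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)

    def forward(self, skip, gate):
        g = F.interpolate(self.phi(gate), size=skip.shape[2:],
                          mode="bilinear", align_corners=False)
        alpha = torch.sigmoid(self.psi(F.relu(self.theta(skip) + g)))
        return skip * alpha  # suppress irrelevant skip-connection features

class GNASPP(nn.Module):
    """Atrous Spatial Pyramid Pooling with GroupNorm in place of BatchNorm."""
    def __init__(self, in_ch, out_ch, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r),
                nn.GroupNorm(8, out_ch),
                nn.ReLU(inplace=True),
            ) for r in rates
        ])
        self.project = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x):
        # parallel dilated branches capture multi-scale context, then fuse
        return self.project(torch.cat([b(x) for b in self.branches], dim=1))
```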

Results: In terms of IoU, Dice score, accuracy, sensitivity, and 95% Hausdorff distance, the proposed model achieved 86.82%, 93.90%, 94.11%, 93.8%, and 7.84 for prostate segmentation, and 79.2%, 89.51%, 88.43%, 89.31%, and 8.39 for lesion-region segmentation, respectively. Compared with the state-of-the-art models FCN, U-Net, U-Net++, and ResU-Net, the proposed model achieved more promising results.
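For reference, the reported metrics can be computed from binary masks as in the generic sketch below (NumPy/SciPy; not the authors' evaluation code). One common convention, assumed here, takes the 95% Hausdorff distance as the 95th percentile of symmetric point-to-nearest-point distances; evaluations often restrict this to surface voxels, whereas full masks are used here for brevity.

```python
# Generic metric sketch for binary segmentation masks (an assumed
# convention, not the authors' code): Dice, IoU, accuracy, sensitivity, HD95.
import numpy as np
from scipy.spatial import cKDTree

def overlap_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    fp = np.logical_and(pred, ~gt).sum()
    fn = np.logical_and(~pred, gt).sum()
    tn = np.logical_and(~pred, ~gt).sum()
    return {
        "iou": tp / (tp + fp + fn),
        "dice": 2 * tp / (2 * tp + fp + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "sensitivity": tp / (tp + fn),
    }

def hd95(pred: np.ndarray, gt: np.ndarray) -> float:
    """95th percentile of symmetric point-to-nearest-point distances (voxels)."""
    p = np.argwhere(pred.astype(bool))
    g = np.argwhere(gt.astype(bool))
    d_pg, _ = cKDTree(g).query(p)  # each predicted point to nearest GT point
    d_gp, _ = cKDTree(p).query(g)  # each GT point to nearest predicted point
    return float(np.percentile(np.concatenate([d_pg, d_gp]), 95))
```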

Conclusion: The proposed model yielded excellent performance in the accurate and automatic segmentation of the prostate and lesion regions, indicating that this novel deep convolutional neural network could be used in clinical diagnosis and treatment.


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10154598
DOI: http://dx.doi.org/10.3389/fonc.2023.1095353
