Reducing CT radiation dose is a frequently proposed measure to enhance patient safety; however, it results in increased image noise and a corresponding degradation of clinical image quality. Several deep learning methods have been proposed for low-dose CT (LDCT) denoising. The high risk posed by possible hallucinations in clinical images necessitates methods that aid the interpretation of deep learning networks. In this study, we use qualitative reader studies and quantitative radiomics studies to assess the perceived quality, signal preservation and statistical feature preservation of LDCT volumes denoised by deep learning, and we compare interpretable deep learning methods with classical deep neural networks in clinical denoising performance.

We conducted an image quality analysis study in which the denoised volumes were rated on four criteria of perceived image quality. We subsequently conducted a lesion detection/segmentation study to assess the impact of denoising on signal detectability. Finally, a radiomic analysis study was performed to assess the quantitative and statistical similarity of the denoised images to standard-dose CT (SDCT) images.

Certain deep learning-based algorithms generated denoised volumes that were qualitatively inferior to SDCT volumes (p < 0.05). Contrary to previous literature, denoising the volumes did not reduce the accuracy of the segmentation (p > 0.05). The denoised volumes, in most cases, yielded radiomics features that were statistically similar to those extracted from SDCT volumes (p > 0.05).

Our results show that the denoised volumes have a lower perceived quality than SDCT volumes. Noise and denoising do not significantly affect the detectability of abdominal lesions. Denoised volumes also contain radiomics features that are statistically similar to those of SDCT volumes.
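As a minimal sketch of the radiomic comparison described above (the abstract does not specify the feature set, extractor settings, or statistical test used; the pyradiomics defaults, placeholder file paths, and the choice of a paired Wilcoxon signed-rank test below are all assumptions), one could extract features from paired SDCT and denoised LDCT volumes and test each feature for a statistically significant difference:

```python
# Hypothetical sketch: paired comparison of radiomics features between
# standard-dose CT (SDCT) and deep-learning-denoised LDCT volumes.
# Extractor settings, file paths, and the Wilcoxon test are assumptions,
# not the study's confirmed pipeline.
import numpy as np
from radiomics import featureextractor  # pip install pyradiomics
from scipy.stats import wilcoxon

# Default extractor: first-order, shape, and texture (GLCM, GLRLM, ...) features.
extractor = featureextractor.RadiomicsFeatureExtractor()

def extract_features(image_path, mask_path):
    """Return {feature_name: value} for one volume/mask pair, dropping diagnostics."""
    result = extractor.execute(image_path, mask_path)
    return {k: float(v) for k, v in result.items()
            if not k.startswith("diagnostics_")}

# Paired cases: same lesion mask applied to SDCT and denoised LDCT volumes
# (paths are placeholders; a real study would use many more cases).
pairs = [
    ("sdct_case01.nii.gz", "denoised_case01.nii.gz", "mask_case01.nii.gz"),
    ("sdct_case02.nii.gz", "denoised_case02.nii.gz", "mask_case02.nii.gz"),
]

sdct_feats = [extract_features(sdct, mask) for sdct, _, mask in pairs]
ldct_feats = [extract_features(denoised, mask) for _, denoised, mask in pairs]

# For each feature, a paired nonparametric test; p > 0.05 is read here as
# "no detectable difference", i.e. the feature is preserved by denoising.
for name in sdct_feats[0]:
    a = np.array([f[name] for f in sdct_feats])
    b = np.array([f[name] for f in ldct_feats])
    if np.allclose(a, b):
        print(f"{name}: identical values")
        continue
    stat, p = wilcoxon(a, b)
    print(f"{name}: p = {p:.3f} ({'preserved' if p > 0.05 else 'differs'})")
```

In practice a multiple-comparison correction (e.g. Bonferroni or Benjamini-Hochberg) would typically be applied across features before interpreting the per-feature p-values.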
DOI: 10.1088/1361-6560/acfc11