Early screening for COVID-19 is essential for pandemic control and for relieving stress on the health care system. Lung segmentation from chest X-ray (CXR) images is a promising step toward early diagnosis of pulmonary diseases. Deep learning has recently achieved great success in supervised lung segmentation; however, effectively exploiting the lung region for COVID-19 screening remains a challenge because of domain shift and the lack of manual pixel-level annotations. We hereby propose a multi-appearance COVID-19 screening framework that uses lung region priors derived from CXR images. First, we propose a multi-scale adversarial domain adaptation network (MS-AdaNet) to improve cross-domain lung segmentation, which supplies prior knowledge to the classification network. Then, we construct a multi-appearance network (MA-Net), composed of three sub-networks that extract and fuse multi-appearance features using the lung region priors. Finally, the proposed MA-Net predicts one of three classes: normal, viral pneumonia, or COVID-19. We evaluate the proposed MS-AdaNet on cross-domain lung segmentation using three different public CXR datasets, and the results show that it outperforms competing methods. Moreover, experiments show that the proposed MA-Net achieves an accuracy of 98.83% and an F1-score of 98.71% on COVID-19 screening, indicating strong screening performance.
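To make the multi-appearance idea concrete, the sketch below illustrates (it is not the authors' implementation) how three sub-networks could extract features from different appearances of a CXR derived from a predicted lung mask and fuse them for three-way classification. The specific appearances (raw image, lung-masked image, mask alone), the ResNet-18 backbones, and the fusion head are assumptions made purely for illustration.

```python
# Minimal sketch of a three-branch "multi-appearance" classifier using a lung
# region prior. NOT the paper's MA-Net: backbones, appearances, and fusion are
# illustrative assumptions only.
import torch
import torch.nn as nn
from torchvision.models import resnet18


def backbone() -> nn.Module:
    """ResNet-18 feature extractor with the classification head removed."""
    net = resnet18(weights=None)
    net.fc = nn.Identity()  # output: 512-d feature vector
    return net


class MultiAppearanceNet(nn.Module):
    """Three sub-networks over lung-prior appearances, fused for 3 classes."""

    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.branch_raw = backbone()     # appearance 1: full CXR
        self.branch_masked = backbone()  # appearance 2: CXR * lung mask (lung region prior)
        self.branch_mask = backbone()    # appearance 3: lung mask alone (shape cue)
        self.classifier = nn.Sequential(
            nn.Linear(512 * 3, 256),
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(256, num_classes),
        )

    def forward(self, cxr: torch.Tensor, lung_mask: torch.Tensor) -> torch.Tensor:
        # cxr: (B, 3, H, W) chest X-ray; lung_mask: (B, 1, H, W) in [0, 1],
        # e.g. produced by a cross-domain segmentation network such as MS-AdaNet.
        masked = cxr * lung_mask              # restrict appearance to the lung region
        mask3 = lung_mask.repeat(1, 3, 1, 1)  # mask replicated to 3 channels
        feats = torch.cat(
            [self.branch_raw(cxr), self.branch_masked(masked), self.branch_mask(mask3)],
            dim=1,
        )
        return self.classifier(feats)         # logits over normal / viral pneumonia / COVID-19


if __name__ == "__main__":
    model = MultiAppearanceNet()
    x = torch.randn(2, 3, 224, 224)
    m = torch.rand(2, 1, 224, 224)
    print(model(x, m).shape)  # torch.Size([2, 3])
```

The key design choice reflected here is late fusion: each appearance keeps its own feature extractor so the lung-prior branches are not dominated by the raw image, and the concatenated features are combined only in the final classifier.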


Source
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC8843059 (PMC)
http://dx.doi.org/10.1109/JBHI.2021.3104629 (DOI)

Publication Analysis

Top Keywords: lung segmentation (20), covid-19 screening (16), lung region (16), region priors (12), proposed ma-net (12), lung (9), chest x-ray (8), screening covid-19 (8), cross-domain lung (8), segmentation task (8)
