Dual-scale shifted window attention network for medical image segmentation.

Sci Rep

School of System Design and Intelligent Manufacturing, Southern University of Science and Technology, 1088 Xueyuan Boulevard, Nanshan District, Shenzhen, 518055, China.

Published: July 2024

Swin Transformer is a landmark effort to reduce the computational complexity of Transformers while preserving their strong performance in computer vision. Window-based patch self-attention exploits the local connectivity of image features, while shifted window-based patch self-attention enables information to flow between patches across the entire image. Through an in-depth study of how different shifted-window sizes affect the efficiency of patch information exchange, this article proposes a Dual-Scale Transformer with a double-sized shifted window attention method. The proposed method surpasses CNN-based methods such as U-Net, AttenU-Net, ResU-Net, and CE-Net by a considerable margin (approximately a 3–6% increase), and outperforms the Transformer-based single-scale Swin Transformer (SwinT) (approximately a 1% increase), on the Kvasir-SEG, ISIC2017, MICCAI EndoVisSub-Instrument, and CadVesSet datasets. The experimental results verify that the proposed dual-scale shifted window attention improves the exchange of patch information and raises segmentation results to the state of the art. We also conduct an ablation study on the effect of shifted-window size on information-flow efficiency and verify that dual-scale shifted window attention is the optimal network design. Our study highlights the significant impact of network structure design on visual performance, providing valuable insights for the design of Transformer-based networks.
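The window partitioning and cyclic-shift mechanics underlying shifted-window attention can be sketched as follows. This is a minimal illustration only: the per-window attention itself is omitted, and the `dual_scale_pass` fusion-by-averaging step and the window sizes (4 and 8) are hypothetical choices, since the abstract does not specify the paper's actual fusion or attention details.

```python
import numpy as np

def window_partition(x, w):
    """Split an (H, W, C) feature map into non-overlapping (w, w, C) windows."""
    H, W, C = x.shape
    x = x.reshape(H // w, w, W // w, w, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, w, w, C)

def window_reverse(windows, w, H, W):
    """Inverse of window_partition: reassemble windows into an (H, W, C) map."""
    C = windows.shape[-1]
    x = windows.reshape(H // w, W // w, w, w, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(H, W, C)

def shifted_window_pass(x, w):
    """One shifted-window step: cyclic shift by w//2, partition into windows,
    (per-window self-attention would go here), reverse, then shift back."""
    shifted = np.roll(x, shift=(-(w // 2), -(w // 2)), axis=(0, 1))
    wins = window_partition(shifted, w)
    # ... per-window self-attention would be applied to `wins` here ...
    out = window_reverse(wins, w, x.shape[0], x.shape[1])
    return np.roll(out, shift=(w // 2, w // 2), axis=(0, 1))

def dual_scale_pass(x, w_small=4, w_large=8):
    """Hypothetical dual-scale step: run the shifted-window pass at two
    window sizes and fuse the results by simple averaging."""
    return 0.5 * (shifted_window_pass(x, w_small) + shifted_window_pass(x, w_large))

x = np.random.rand(16, 16, 3)
y = dual_scale_pass(x)
assert y.shape == x.shape
# With the attention step omitted, the shift/partition round-trip is the identity:
assert np.allclose(y, x)
```

The smaller window restricts attention to a tight neighborhood while the larger one lets information travel further per layer; the cyclic shift (here `np.roll` by half the window size) is what lets adjacent windows exchange information across layers.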


Source

PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC11291481
DOI: http://dx.doi.org/10.1038/s41598-024-68587-1


