Learning global dependencies based on hierarchical full connection for brain tumor segmentation.

Comput Methods Programs Biomed

School of Computer and Computational Science, Zhejiang University City College, Hangzhou, 310011, China.

Published: June 2022

Background and Objective: Because the appearance, shape, and location of brain tumors vary greatly among patients, brain tumor segmentation (BTS) is extremely challenging. Many recent studies address this problem with attention mechanisms, which fall roughly into two categories: convolution-based spatial attention (with or without channel attention) and self-attention. Owing to the limitations of convolution operations, convolution-based spatial attention cannot learn global dependencies well, which degrades its performance in BTS. A simple remedy is to substitute self-attention, which excels at learning global dependencies. However, self-attention consumes substantial GPU memory, so this simple substitution cannot be applied to high-resolution low-level feature maps, which carry considerable geometric information and are also important for improving the performance of attention mechanisms in BTS.
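To see why self-attention is prohibitive on high-resolution maps, note that the attention-weight matrix is N x N over all N spatial positions. A back-of-the-envelope sketch (our own illustration, not a figure from the paper; the 192 x 192 map size and float32 storage are assumptions):

```python
# Memory of the N x N self-attention weight matrix over every spatial
# position of a feature map, ignoring batch, heads, and activations.
def attn_matrix_bytes(shape, dtype_bytes=4):
    n = 1
    for s in shape:
        n *= s          # N = total number of spatial positions
    return n * n * dtype_bytes

# A hypothetical high-resolution low-level feature map of 192 x 192:
# N = 36,864 positions, so the weight matrix alone is ~5.06 GiB.
gib = attn_matrix_bytes((192, 192)) / 2**30
print(f"{gib:.4f} GiB")  # -> 5.0625 GiB
```

The quadratic growth in N is what rules out naive self-attention at the resolutions where low-level geometric detail lives.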

Method: In this paper, we propose a hierarchical fully connected module, named H-FC, to learn global dependencies. H-FC hierarchically learns local dependencies at different feature-map scales through fully connected layers, then combines these local dependencies to approximate the global dependencies. H-FC requires very little GPU memory and can easily replace convolution-based spatial attention modules, such as Attention Gate and the SAM in CBAM, to improve the performance of attention mechanisms in BTS.
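The abstract does not give H-FC's exact architecture, but the hierarchy of region-wise fully connected mixing can be sketched as follows. This is a minimal NumPy illustration under our own assumptions (shared random weights, block side growing as `region**(k+1)` per step, map sizes divisible by the block side); the paper's actual region/step scheme and learned parameters may differ:

```python
import numpy as np

def fc_over_regions(x, weight, bias, r):
    """Apply one shared fully connected layer within every r x r spatial
    block of x, mixing the r*r positions inside each block (local deps)."""
    B, C, H, W = x.shape
    out = np.empty_like(x)
    for i in range(0, H, r):
        for j in range(0, W, r):
            block = x[:, :, i:i + r, j:j + r].reshape(B, C, r * r)
            mixed = block @ weight + bias            # (B, C, r*r)
            out[:, :, i:i + r, j:j + r] = mixed.reshape(B, C, r, r)
    return out

def h_fc(x, region=2, steps=2, seed=0):
    """Hierarchical FC sketch: at step k, positions are mixed inside blocks
    of side region**(k+1); stacking steps widens the receptive field until
    it spans the whole map, approximating global dependencies cheaply."""
    rng = np.random.default_rng(seed)
    for k in range(steps):
        r = region ** (k + 1)                        # block side grows per step
        n = r * r
        weight = rng.standard_normal((n, n)) / np.sqrt(n)
        bias = np.zeros(n)
        x = fc_over_regions(x, weight, bias, r)
    return x
```

With `region=2` and `steps=2` on a 4 x 4 map, the second step's blocks already cover the full map, so every output position depends on every input position, yet each FC layer only ever operates on small `n x n` matrices rather than an attention matrix over all positions.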

Results: Extensive comparative experiments show that H-FC outperforms Attention Gate and the SAM in CBAM, which lack the ability to learn global dependencies, on BTS, improving most metrics with a notably larger improvement in Hausdorff Distance. Comparing the model's computation and parameter counts before and after adding H-FC proves that H-FC is lightweight.

Conclusion: In this paper, we propose a novel module, H-FC, to learn global dependencies, and we demonstrate its effectiveness through experiments on the BraTS2020 dataset. We mainly explore the influence of the region size and the number of steps on H-FC's performance, and we confirm that the global dependencies of low-level feature maps are also important for BTS. A time and space complexity analysis, together with the experimental results, shows that H-FC is lightweight.

Source: http://dx.doi.org/10.1016/j.cmpb.2022.106925

