Autonomous underwater vehicles (AUVs) equipped with intelligent underwater object detection are of great significance for underwater navigation. Advanced underwater object detection frameworks adopt skip connections to enhance feature representation, which further boosts detection precision. However, we reveal two limitations of standard skip connections: 1) they do not account for feature heterogeneity, resulting in a sub-optimal feature fusion strategy; and 2) feature redundancy exists in the skip-connected features: not all channels in the fused feature maps are equally important, so network learning should focus on the informative channels rather than the redundant ones. In this paper, we propose a novel channel-weighted skip connection network (CWSCNet) that learns multiple hyper fusion features to improve multi-scale underwater object detection. In CWSCNet, a novel feature fusion module, named channel-weighted skip connection (CWSC), adaptively adjusts the importance of different channels during feature fusion. The CWSC module removes feature heterogeneity, strengthening the compatibility of different feature maps, and also acts as an effective feature selection strategy that enables CWSCNet to focus on learning channels with more object-related information. Extensive experiments on three underwater object detection datasets (RUOD, URPC2017, and URPC2018) show that the proposed CWSCNet achieves comparable or state-of-the-art performance in underwater object detection.
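The abstract does not spell out the exact design of the CWSC module, so the sketch below is only an illustration of the general idea: per-branch normalization to reduce feature heterogeneity, followed by a squeeze-and-excitation-style channel gate that reweights the fused feature map toward informative channels. The class name, parameters, and layer choices are assumptions, not the authors' implementation.

```python
# Illustrative sketch of channel-weighted skip-connection fusion (hypothetical design).
import torch
import torch.nn as nn


class ChannelWeightedSkipFusion(nn.Module):
    """Fuse a backbone (skip) feature with a decoder/FPN feature,
    reweighting channels so informative ones dominate the fused map."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        # Per-branch normalization to reduce feature heterogeneity
        # before the two maps are combined (assumed, not from the paper).
        self.norm_skip = nn.BatchNorm2d(channels)
        self.norm_main = nn.BatchNorm2d(channels)
        # Channel-weighting branch: global pooling + bottleneck MLP -> sigmoid gate.
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, skip: torch.Tensor, main: torch.Tensor) -> torch.Tensor:
        fused = self.norm_skip(skip) + self.norm_main(main)  # element-wise fusion
        weights = self.gate(fused)                           # per-channel importance in [0, 1]
        return fused * weights                               # emphasize informative channels


# Usage: fuse two same-shaped feature maps from different network branches.
if __name__ == "__main__":
    fusion = ChannelWeightedSkipFusion(channels=256)
    a = torch.randn(2, 256, 32, 32)
    b = torch.randn(2, 256, 32, 32)
    print(fusion(a, b).shape)  # torch.Size([2, 256, 32, 32])
```

In this sketch the gating weights are computed from the already-fused map, so both inputs influence which channels are emphasized; the actual CWSC module in the paper may compute or apply the weights differently.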

Source: http://dx.doi.org/10.1109/TIP.2024.3457246

Publication Analysis

Top Keywords (frequency): underwater object (24), object detection (24), channel-weighted skip (12), skip connection (12), skip connections (12), feature fusion (12), feature (10), connection network (8), underwater (8), standard skip (8)
