FPWT: Filter pruning via wavelet transform for CNNs.

Neural Netw

School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200444, China.

Published: November 2024

AI Article Synopsis

  • Convolutional Neural Networks (CNNs) require large amounts of data and compute, which hinders their deployment on mobile devices.
  • Filter pruning addresses this by removing redundant filters, shrinking the model while preserving performance.
  • This study introduces FPWT, a pruning method that analyzes feature maps in the frequency domain via the wavelet transform, improving the efficiency and accuracy of CNNs for image classification.

Article Abstract

The enormous data and computational resources required by Convolutional Neural Networks (CNNs) hinder their practical application on mobile devices. Filter pruning has become one of the practical approaches to this problem. Most existing pruning methods operate in the spatial domain, which ignores potential interconnections in the model structure and the decentralized distribution of image energy in the spatial domain. Frequency-domain transforms can remove the correlation between image pixels and concentrate the image energy distribution, which enables lossy compression of images. In this study, we find that frequency-domain transforms are also applicable to the feature maps of CNNs. This paper proposes filter pruning via wavelet transform (FPWT), which combines the frequency-domain information of the wavelet transform (WT) with the output feature maps to expose the correlation between feature maps more clearly and to concentrate their energy distribution in the frequency domain. The importance score of each feature map is computed from the cosine similarity and the energy-weighted coefficients of the high- and low-frequency components, and filters are pruned according to these scores. Experiments on two image classification datasets validate the effectiveness of FPWT. For ResNet-110 on CIFAR-10, FPWT reduces FLOPs and parameters by more than 60.0% with a 0.53% accuracy improvement. For ResNet-50 on ImageNet, FPWT reduces FLOPs by 53.8% and parameters by 49.7% with only a 0.97% loss of Top-1 accuracy.
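The scoring scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a single-level Haar DWT, and the energy weights `w_low`/`w_high` and the exact way redundancy (mean cosine similarity) is combined with energy are assumptions of this sketch.

```python
import numpy as np

def haar_dwt2(x):
    """Single-level 2D Haar DWT: returns the LL sub-band and the
    (LH, HL, HH) detail sub-bands of a 2D feature map."""
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row averages
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, (lh, hl, hh)

def importance_scores(fmaps, w_low=0.7, w_high=0.3):
    """Score each feature map by its energy-weighted frequency content,
    discounted by its mean cosine similarity to the other maps in the
    frequency domain (highly redundant maps score lower)."""
    flat, energies = [], []
    for fm in fmaps:
        ll, (lh, hl, hh) = haar_dwt2(fm)
        e_low = np.sum(ll ** 2)
        e_high = np.sum(lh ** 2) + np.sum(hl ** 2) + np.sum(hh ** 2)
        energies.append(w_low * e_low + w_high * e_high)
        flat.append(np.concatenate([ll.ravel(), lh.ravel(),
                                    hl.ravel(), hh.ravel()]))
    flat = np.stack(flat)
    norms = np.linalg.norm(flat, axis=1, keepdims=True) + 1e-12
    cos = (flat / norms) @ (flat / norms).T
    n = len(fmaps)
    redundancy = (cos.sum(axis=1) - 1.0) / (n - 1)  # mean similarity to others
    return np.asarray(energies) * (1.0 - redundancy)

def filters_to_prune(fmaps, prune_ratio=0.5):
    """Return indices of the least important filters to remove."""
    scores = importance_scores(fmaps)
    k = int(len(fmaps) * prune_ratio)
    return np.argsort(scores)[:k]
```

In this sketch, an all-zero (dead) feature map carries no energy and is therefore ranked first for pruning; feature maps that are near-duplicates of others are penalized through the cosine-similarity term.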


Source
DOI: http://dx.doi.org/10.1016/j.neunet.2024.106577

Publication Analysis

Top Keywords

frequency domain (16)
filter pruning (12)
pruning wavelet (8)
wavelet transform (8)
spatial domain (8)
image energy (8)
domain transform (8)
transform method (8)
feature maps (8)
feature map (8)

