Cluster-Based Structural Redundancy Identification for Neural Network Compression.

Entropy (Basel)

State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China.

Published: December 2022

The increasingly large size of neural networks makes them difficult to deploy on edge devices with limited computing resources. Network pruning has become one of the most successful model compression methods in recent years. Existing works typically compress models based on importance, removing the filters judged unimportant. This paper reconsiders model pruning from the perspective of structural redundancy, arguing that identifying functionally similar filters plays a more important role, and proposes a clustering-based redundancy identification framework for model pruning. First, we perform cluster analysis on the filters of each layer to group them into similar sets with different functions. We then propose a criterion for identifying redundant filters within each similar set. Finally, we propose a pruning scheme that automatically determines the pruning rate of each layer. Extensive experiments on various benchmark network architectures and datasets demonstrate the effectiveness of the proposed framework.
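The abstract outlines a three-step pipeline: cluster the filters of each layer into similar sets, flag redundant filters within each set, and derive a per-layer pruning rate. Below is a minimal sketch of how such a pipeline might look, assuming k-means over flattened filter weights and an L1-norm rule for picking each cluster's representatives; the paper's actual similarity measure, redundancy criterion, and automatic pruning-rate scheme are not specified in the abstract, so `prune_layer_by_clusters`, `n_clusters`, and `keep_per_cluster` are illustrative choices, not the authors' method.

```python
import numpy as np
from sklearn.cluster import KMeans

def prune_layer_by_clusters(weights, n_clusters=8, keep_per_cluster=1):
    """Toy clustering-based redundancy identification for one conv layer.

    weights: array of shape (num_filters, in_channels, kH, kW).
    Returns the sorted indices of filters to keep.
    """
    num_filters = weights.shape[0]
    flat = weights.reshape(num_filters, -1)  # one row per filter

    # Step 1: group functionally similar filters into similar sets.
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(flat)

    keep = []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        # Step 2: within a similar set, keep the filters with the largest
        # L1 norm and mark the rest as redundant (an assumed criterion;
        # the paper defines its own).
        norms = np.abs(flat[members]).sum(axis=1)
        ranked = members[np.argsort(norms)[::-1]]
        keep.extend(ranked[:keep_per_cluster])
    return sorted(keep)

# Example: a layer with 64 filters of shape 16x3x3.
rng = np.random.default_rng(0)
layer = rng.normal(size=(64, 16, 3, 3))
kept = prune_layer_by_clusters(layer, n_clusters=8, keep_per_cluster=2)
print(f"keeping {len(kept)} of 64 filters")  # pruning rate of 75% here
```

In this toy version the per-layer pruning rate simply falls out of the cluster count and the per-cluster keep budget; step 3 of the paper instead determines the rate automatically, by a scheme the abstract does not detail.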

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9857617
DOI: http://dx.doi.org/10.3390/e25010009
