On the compression of neural networks using ℓ-norm regularization and weight pruning.

Neural Networks

LINSE-Circuits and Signal Processing Laboratory, Department of Electrical Engineering, Federal University of Santa Catarina, Florianópolis, 88040-900, Brazil.

Published: March 2024

Despite the growing availability of high-capacity computational platforms, implementation complexity remains a major concern for the real-world deployment of neural networks. This concern stems not only from the huge costs of state-of-the-art network architectures, but also from the recent push toward edge intelligence and the use of neural networks in embedded applications. In this context, network compression techniques have been gaining interest due to their ability to reduce deployment costs while keeping inference accuracy at satisfactory levels. The present paper is dedicated to the development of a novel compression scheme for neural networks. To this end, a new form of ℓ-norm-based regularization is first developed, which is capable of inducing strong sparsity in the network during training. Then, by targeting the smaller weights of the trained network with pruning techniques, smaller yet highly effective networks can be obtained. The proposed compression scheme also employs ℓ-norm regularization to avoid overfitting, as well as fine-tuning to improve the performance of the pruned network. Experimental results are presented to show the effectiveness of the proposed scheme and to compare it with competing approaches.
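The pipeline described in the abstract — sparsity-inducing regularization during training, magnitude-based pruning of the smallest weights, then fine-tuning with the pruned connections held at zero — can be sketched as follows. This is a minimal illustration, not the paper's method: the specific ℓ-norm regularizer proposed in the paper is not given in the abstract, so a standard ℓ1 penalty is used here as a stand-in, and the function names (`l1_subgradient`, `magnitude_prune`, `masked_step`) are hypothetical.

```python
import numpy as np

def l1_subgradient(w, lam):
    """Subgradient of the sparsity-inducing penalty lam * ||w||_1.

    Added to the loss gradient during training to push weights toward zero.
    (Stand-in for the paper's ℓ-norm regularizer, which the abstract
    does not specify.)
    """
    return lam * np.sign(w)

def magnitude_prune(w, sparsity):
    """Zero out roughly the smallest-magnitude `sparsity` fraction of weights.

    Returns the pruned weights and a boolean mask of surviving connections.
    Ties at the threshold may prune slightly more than requested.
    """
    k = int(np.floor(sparsity * w.size))
    if k == 0:
        return w.copy(), np.ones(w.shape, dtype=bool)
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    mask = np.abs(w) > thresh
    return w * mask, mask

def masked_step(w, grad, mask, lr):
    """One fine-tuning update that keeps pruned weights fixed at zero."""
    return (w - lr * grad) * mask
```

The mask is the key design choice: fine-tuning proceeds on the surviving weights only, so the sparsity achieved by pruning is preserved while the remaining parameters recover any accuracy lost at the pruning step.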

DOI: 10.1016/j.neunet.2023.12.019

