SLRProp: A Back-Propagation Variant of Sparse Low Rank Method for DNNs Reduction.

Sensors (Basel)

TECNALIA, Basque Research and Technology Alliance (BRTA), 20009 San Sebastian, Spain.

Published: March 2023

The application of deep neural networks (DNNs) in edge computing has emerged from the need for real-time, distributed responses from many devices across a large number of scenarios. To this end, compressing these original structures is urgent due to the high number of parameters needed to represent them. Consequently, the most representative components of the different layers are kept in order to maintain the network's accuracy as close as possible to that of the full network. Two approaches are developed in this work. First, the Sparse Low Rank (SLR) method is applied to two different fully connected (FC) layers to observe its effect on the final response, and the method is also applied to the last of these layers as a duplicate. Second, SLRProp is proposed as a variant in which the relevance of each component of the earlier FC layer is weighted as the sum of the products of these neurons' absolute values and the relevances of the neurons in the last FC layer connected to them. The relationship between relevances across layers is thus taken into account. Experiments are carried out on well-known architectures to conclude whether relevances propagated across layers have less effect on the network's final response than independent intra-layer relevances.
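The cross-layer relevance rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the "absolute values" in the abstract refer to the connecting weights between the two FC layers (one plausible reading), and all names (`W`, `relevance_last`, `slrprop_relevance`) are illustrative.

```python
import numpy as np

def slrprop_relevance(W, relevance_last):
    """Back-propagate relevances from the last FC layer to the previous one.

    W: weight matrix of shape (n_prev, n_last) connecting the two layers
       (assumed; the paper's exact convention may differ).
    relevance_last: relevance scores of the last FC layer, shape (n_last,).

    Each previous-layer neuron's relevance is the sum, over the last-layer
    neurons it connects to, of |weight| times that neuron's relevance.
    """
    return np.abs(W) @ relevance_last

# Toy example: 3 neurons in the earlier FC layer feeding 2 output neurons.
W = np.array([[0.5, -1.0],
              [2.0,  0.0],
              [-0.5, 0.5]])
r_last = np.array([1.0, 2.0])
r_prev = slrprop_relevance(W, r_last)
print(r_prev)  # [2.5 2.  1.5]
```

Neurons with low resulting relevance would then be candidates for removal, keeping the layers' most representative components as in the SLR baseline.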

Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10006865
DOI: http://dx.doi.org/10.3390/s23052718

Publication Analysis

Top Keywords (frequency): sparse low (8); low rank (8); rank method (8); final response (8); relevances (5); slrprop back-propagation (4); back-propagation variant (4); variant sparse (4); method dnns (4); dnns reduction (4)
