Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms

Neural Processing Letters

School of Computer Science and Technology, Tiangong University, Tianjin, 300387 China.

Published: January 2023

The success of deep learning has brought breakthroughs to many fields. However, the improved performance of deep learning models is often accompanied by increases in their depth and width, which conflict with the limited storage, energy, and computational power of edge devices. Knowledge distillation, an effective model compression method, transfers knowledge from a complex teacher model to a student model. Self-distillation is a special form of knowledge distillation that does not require a pre-trained teacher model. However, existing self-distillation methods rarely consider how to use the model's early features effectively. Furthermore, most self-distillation methods use features from the deepest layers of the network to guide the training of the network's branches, which we find is not the optimal choice. In this paper, we found that the feature maps obtained by early feature fusion do not serve as a good teacher to guide their own training. Based on this, we propose a selective feature fusion module and further obtain a new self-distillation method, knowledge fusion distillation. Extensive experiments on three datasets demonstrate that our method performs comparably to state-of-the-art distillation methods. In addition, the performance of the network can be further enhanced when fused features are integrated into the network.
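For context, both teacher-based distillation and self-distillation build on the same core objective: the student is trained to match a temperature-softened output distribution. Below is a minimal plain-Python sketch of that classic logit-based distillation loss (in the style of Hinton et al.), not the paper's selective feature fusion module; function names and the example logits are illustrative only.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax: higher T yields a softer distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between softened teacher and student distributions.

    The T^2 factor is the standard correction that keeps gradient
    magnitudes roughly constant as the temperature varies.
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    return temperature ** 2 * sum(
        pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0
    )

# When the student's logits match the teacher's, the loss is zero.
print(round(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
```

In self-distillation, the "teacher" logits come from the model itself (e.g. from its deepest layer or, as this paper proposes, from selectively fused features) rather than from a separate pre-trained network.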


Source:
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC9807430
DOI: http://dx.doi.org/10.1007/s11063-022-11132-w
