Multilayer Semantic Features Adaptive Distillation for Object Detectors.

Sensors (Basel)

Key Laboratory of Smart Agriculture and Forestry, College of Computer and Information Sciences, Fujian Agriculture and Forestry University, Fuzhou 350002, China.

Published: September 2023

Knowledge distillation (KD) is a well-established technique for compressing neural networks and has gained increasing attention in object detection tasks. However, typical object detection distillation methods use fixed-level semantic features for distillation, which may not be optimal for all training stages and samples. In this paper, a multilayer semantic feature adaptive distillation (MSFAD) method is proposed that uses a routing network composed of a teacher detector, a student detector, and an agent network for decision making. Specifically, the inputs to the agent network are the features output by the neck structures of the teacher and student detectors, and its output is a decision on which semantic-level features to use for distillation. The MSFAD method improves the distillation training process by enabling the student detector to automatically select valuable semantic-level features from the teacher detector. Experimental results demonstrated that the proposed method increased the mAP50 of YOLOv5s by 3.4% and the mAP50-95 by 3.3%. Additionally, YOLOv5n with only 1.9 M parameters achieved detection performance comparable to that of YOLOv5s.
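To make the idea concrete, the sketch below illustrates one plausible way such an agent network could be wired up: a small per-level scoring head receives pooled teacher and student neck features and produces selection weights that gate a feature-imitation loss. This is an illustrative sketch only, not the authors' implementation; the module names, channel sizes, and the use of soft sigmoid weights (rather than a sampled hard decision) are assumptions, and teacher/student features are assumed to be already projected to matching shapes.

```python
# Hedged sketch of adaptive multilayer feature selection for distillation.
# All names and shapes are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureSelectionAgent(nn.Module):
    """Scores each semantic (neck) level from pooled teacher/student features."""
    def __init__(self, channels_per_level):
        super().__init__()
        # One small MLP per level; input is the concatenated pooled
        # teacher and student feature vectors for that level.
        self.heads = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * c, 64), nn.ReLU(), nn.Linear(64, 1))
            for c in channels_per_level
        )

    def forward(self, teacher_feats, student_feats):
        scores = []
        for head, t, s in zip(self.heads, teacher_feats, student_feats):
            pooled = torch.cat([
                F.adaptive_avg_pool2d(t, 1).flatten(1),
                F.adaptive_avg_pool2d(s, 1).flatten(1),
            ], dim=1)
            scores.append(head(pooled))
        # Soft per-level selection weights in [0, 1]; a policy that samples
        # hard choices (e.g. Gumbel-softmax) would be an alternative.
        return torch.sigmoid(torch.cat(scores, dim=1))  # (batch, num_levels)

def distillation_loss(teacher_feats, student_feats, weights):
    """Feature-imitation loss weighted by the agent's per-level decisions."""
    loss = 0.0
    for i, (t, s) in enumerate(zip(teacher_feats, student_feats)):
        loss = loss + weights[:, i].mean() * F.mse_loss(s, t.detach())
    return loss

# Usage with three hypothetical neck levels (P3/P4/P5-style shapes assumed):
channels = [128, 256, 512]
agent = FeatureSelectionAgent(channels)
t_feats = [torch.randn(2, c, s, s) for c, s in zip(channels, (80, 40, 20))]
s_feats = [torch.randn(2, c, s, s) for c, s in zip(channels, (80, 40, 20))]
w = agent(t_feats, s_feats)
loss = distillation_loss(t_feats, s_feats, w)
```

In this reading, the agent's weights decide how strongly each teacher neck level contributes to the student's distillation loss at the current training step, which is the role the abstract assigns to the decision-making network.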


Source
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10490649
DOI: http://dx.doi.org/10.3390/s23177613

Publication Analysis

Top Keywords

multilayer semantic (8), semantic features (8), adaptive distillation (8), object detection (8), distillation msfad (8), msfad method (8), teacher student (8), student detector (8), distillation (7), features (5)

