Knowledge distillation, which aims to transfer the knowledge learned by a cumbersome teacher model to a lightweight student model, has become one of the most popular and effective techniques in computer vision. However, many previous knowledge distillation methods are designed for image classification and fail in more challenging tasks such as object detection. In this paper, we first suggest that the failure of knowledge distillation on object detection is mainly caused by two reasons: (1) the imbalance between foreground and background pixels and (2) the lack of distillation on the relation among different pixels. We then propose a structured knowledge distillation scheme, comprising attention-guided distillation and non-local distillation, to address these two issues, respectively. Attention-guided distillation uses an attention mechanism to find the crucial pixels of foreground objects and makes the student devote more effort to learning their features. Non-local distillation enables the student to learn not only the features of individual pixels but also the relation between different pixels, captured by non-local modules. Experimental results demonstrate the effectiveness of our method on thirteen kinds of object detection models with twelve comparison methods for both object detection and instance segmentation. For instance, Faster RCNN with our distillation achieves 43.9 mAP on MS COCO2017, which is 4.1 mAP higher than the baseline. Additionally, we show that our method also benefits the robustness and domain generalization ability of detectors. Code and model weights have been released on GitHub.
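The two components described above can be sketched as loss terms over teacher and student feature maps. The following is a minimal NumPy sketch under stated assumptions, not the authors' released code: the attention mask is taken as a temperature-softmax over channel-averaged absolute activations, and the pixel-relation matrix is a simple dot-product softmax in the spirit of a non-local block; all function names and the temperature `T` are illustrative.

```python
import numpy as np

def spatial_attention(feat, T=0.5):
    """Spatial attention mask from a (C, H, W) feature map: mean absolute
    activation over channels, sharpened by a temperature-T softmax."""
    a = np.abs(feat).mean(axis=0)                    # (H, W)
    w = np.exp(a / T)
    return w / w.sum()                               # normalized, sums to 1

def attention_guided_kd_loss(f_student, f_teacher, T=0.5):
    """Per-pixel L2 feature-imitation loss weighted by the teacher's
    attention, so crucial foreground pixels dominate the signal."""
    mask = spatial_attention(f_teacher, T)           # (H, W)
    sq_err = ((f_student - f_teacher) ** 2).mean(axis=0)  # (H, W)
    return float((mask * sq_err).sum())

def pixel_relation(feat):
    """Pairwise pixel-relation matrix in the spirit of a non-local block:
    row-wise softmax over dot products of per-pixel feature vectors."""
    C, H, W = feat.shape
    x = feat.reshape(C, H * W).T                     # (HW, C)
    logits = x @ x.T                                 # (HW, HW)
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def relation_kd_loss(f_student, f_teacher):
    """Distill the relation among pixels: match student and teacher
    relation matrices under an L2 penalty."""
    diff = pixel_relation(f_student) - pixel_relation(f_teacher)
    return float(np.mean(diff ** 2))

# Toy feature maps standing in for one backbone level of teacher and student.
rng = np.random.default_rng(0)
t = rng.normal(size=(8, 4, 4))
s = rng.normal(size=(8, 4, 4))
print(attention_guided_kd_loss(s, t), relation_kd_loss(s, t))
```

In practice both terms would be computed on aligned detector feature maps (e.g. FPN levels) and added, with weighting coefficients, to the ordinary detection loss.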
DOI: http://dx.doi.org/10.1109/TPAMI.2023.3300470
For wavelength division multiplexing (WDM) systems, excessive linear and nonlinear noise seriously degrades the quality of optical signals, and an effective joint monitoring scheme can prevent the degradation of system performance due to noise accumulation. In this paper, we propose a probability information assisted knowledge distillation (PIAKD) scheme that achieves intelligent joint monitoring of the linear signal-to-noise ratio (SNRL) and nonlinear signal-to-noise ratio (SNRNL) in WDM systems. Under multi-task regression, where the outputs are independent and continuous, PIAKD addresses the longstanding challenge that the student model fails to learn effectively from the teacher model by introducing probability information into the loss function.
Plant Dis
January 2025
Universidad de Chile, Departamento de Sanidad Vegetal, Facultad de Ciencias Agronomicas, Casilla 1004, Santiago, Chile, 8820000;
Walnut (Juglans regia L.) is the primary nut tree cultivated in Chile, covering 44,626 ha.
Plant Dis
January 2025
LSU AgCenter, Plant Pathology and Crop Physiology, Baton Rouge, Louisiana, United States.
In July 2023, panicle and leaf blight-like symptoms were observed on the rice () variety PVL03 in research field plots in Louisiana (Rayne, LA 70578, USA; 30.21330° N, 92.37309° W).
Plant Dis
January 2025
Kashi, Xinjiang, China;
Fig (Ficus carica L.) holds economic significance in Atushi, Xinjiang, but as fig cultivation expands, disease prevalence has risen. In July 2024, approximately 22% of harvested fig (cv.
Plant Dis
January 2025
Institute of Plant Protection, Gansu Academy of Agricultural Sciences, Lanzhou, Gansu, China;
Astragalus mongholicus is a perennial Chinese medicinal herb in the family Leguminosae, widely cultivated in China. In September 2023, A. mongholicus plants in a field in Weiyuan County, Gansu Province, showed symptoms of circular or irregular brown, sunken, necrotic lesions, with multiple lesions coalescing, and brown longitudinal cracks in the roots.