Feb 24, 2021 · Abstract: Knowledge distillation (KD) has demonstrated a powerful capability for learning compact models in object detection.
In this section, we give a brief review of related work, including bounding box regression, localization quality estimation, and knowledge distillation.
Our distillation scheme is simple yet effective and can be easily applied to different dense object detectors. Experiments show that our LD can boost the ...
Feb 24, 2023 · Our distillation scheme is simple yet effective and can be easily applied to both dense horizontal object detectors and rotated object ...
In this paper, we propose a flexible localization distillation for dense object detection and a selective region distillation based on a new valuable ...
This paper reformulates the knowledge distillation process for localization, presenting a novel localization distillation method that can efficiently ...
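The core idea described above is to distill localization knowledge directly, rather than only classification logits. A common way to do this (and the formulation LD builds on) is to represent each bounding-box edge as a discrete probability distribution over candidate positions and transfer the teacher's distribution to the student via a temperature-scaled KL divergence. The sketch below is a minimal, hedged illustration of that idea; the function name, tensor shapes, and the default temperature are assumptions for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def localization_distillation_loss(student_logits: torch.Tensor,
                                   teacher_logits: torch.Tensor,
                                   temperature: float = 10.0) -> torch.Tensor:
    """Sketch of a localization distillation (LD) loss.

    Assumed input shape: (N, 4, n_bins) -- for each of N boxes, each of
    the four edges (left, top, right, bottom) is a vector of logits over
    n_bins discretized positions, as in the "general distribution"
    bounding-box representation. The loss is the temperature-scaled KL
    divergence from the teacher's edge distributions to the student's.
    """
    T = temperature
    # Soften both distributions with the same temperature.
    p_teacher = F.softmax(teacher_logits / T, dim=-1)
    log_p_student = F.log_softmax(student_logits / T, dim=-1)
    # Elementwise KL, summed over the bin dimension -> one value per edge.
    kl = F.kl_div(log_p_student, p_teacher, reduction="none").sum(dim=-1)
    # T^2 rescaling keeps gradient magnitudes comparable across temperatures,
    # as in standard logit distillation.
    return (T * T) * kl.mean()
```

Because the loss operates on the regression head's logits, it adds no inference-time cost: the teacher and the extra KL term are used only during training.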
Localization Distillation for Dense Object Detection, Supplementary Materials ... We also provide experimental results on another popular object detection benchmark ...
Aug 1, 2023 · In this paper, we investigate whether logit mimicking always lags behind feature imitation. Towards this goal, we first present a novel ...
LD stably improves rotated detectors without adding any computational cost!
Introduction. Previous knowledge distillation (KD) methods for object detection ...