CoreKD: A Context-Aware Local Region Structural Contrastive Knowledge Distillation Framework for Object Detection

Knowledge distillation (KD) aims to transfer knowledge from a cumbersome teacher to a lightweight student, reducing overall model complexity without sacrificing performance. However, existing methods focus excessively on pixel-level knowledge transfer while overlooking localized structural and contextual information. To address this, we propose a novel context-aware local region structural contrastive knowledge distillation framework (CoreKD) for object detection. Specifically, we introduce a p