🎉 Congratulations on Paper Acceptance at ICME 2025!
Mar 21, 2025 ·
1 min read
Dacheng Qi
Huayu Zhang
Yufeng Wang
Shuangkang Fang
Zehao Zhang
Zesheng Wang
Wenrui Ding

The accepted paper proposes Student-Driven Knowledge Distillation (SDKD), a method that enhances knowledge distillation by introducing a proxy teacher modeled after the student network. Unlike traditional approaches that rely solely on the teacher network, SDKD uses a Feature Fusion Block to bridge the structural gap between teacher and student and to transfer both feature-based and response-based knowledge. Extensive experiments on classification, semantic segmentation, and depth estimation show that SDKD consistently outperforms 29 state-of-the-art methods, offering a distillation process that is more efficient, more effective, and tailored to the student's structure.
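For readers curious how such a scheme might look in code, below is a minimal, hypothetical PyTorch sketch of the general idea: a fusion block builds a student-shaped proxy teacher from teacher features, and the student is then supervised with both a feature loss and a soft-label response loss. The `FeatureFusionBlock` internals, loss forms, and weights here are illustrative assumptions, not the paper's actual design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureFusionBlock(nn.Module):
    """Illustrative fusion (assumed design): project teacher features into the
    student's channel space and blend them with the student's own features."""
    def __init__(self, teacher_channels: int, student_channels: int):
        super().__init__()
        self.proj = nn.Conv2d(teacher_channels, student_channels, kernel_size=1)

    def forward(self, teacher_feat: torch.Tensor, student_feat: torch.Tensor) -> torch.Tensor:
        t = self.proj(teacher_feat)
        if t.shape[-2:] != student_feat.shape[-2:]:
            # Match spatial size when teacher and student resolutions differ.
            t = F.interpolate(t, size=student_feat.shape[-2:],
                              mode="bilinear", align_corners=False)
        # Student-shaped "proxy teacher" feature.
        return 0.5 * (t + student_feat)

def sdkd_loss(proxy_feat, student_feat, teacher_logits, student_logits,
              alpha: float = 1.0, beta: float = 1.0, T: float = 4.0):
    # Feature-based knowledge: pull student features toward the proxy teacher.
    feat_loss = F.mse_loss(student_feat, proxy_feat.detach())
    # Response-based knowledge: standard temperature-scaled KL distillation.
    resp_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return alpha * feat_loss + beta * resp_loss
```

In a real training loop this distillation term would be added to the usual task loss (e.g. cross-entropy against ground-truth labels), with the fusion block trained jointly alongside the student.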