Awesome-Knowledge-Distillation
Please add a relevant paper from CVPR 2024
Hi @FLHonker, I noticed that there is no 2024 paper listed yet. Could you please add a relevant paper from CVPR 2024 (Highlight)? The information is given below:
Title: Logit Standardization in Knowledge Distillation
Paper: https://arxiv.org/abs/2403.01427
GitHub: https://github.com/sunshangquan/logit-standardization-KD
Supplements: https://sunsean21.github.io/resources/cvpr2024_supp.pdf
The paper discusses the possibility of assigning temperatures distinctly between the teacher and student, and dynamically across samples. It then proposes a weighted Z-score logit standardization as a plug-and-play preprocessing step that can boost existing logit-based KD methods (a rough illustrative sketch is included below). Thank you for your attention.
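For reference, here is a minimal sketch of what such a Z-score preprocessing step might look like when dropped in front of a standard KL-based logit distillation loss. This is only an illustration of the idea, not the authors' implementation (see the linked repo for that); the function name `zscore_standardize`, the `base_temp` default, and the small epsilon are my own assumptions.

```python
# Illustrative sketch only: Z-score logit standardization before a standard KD loss.
# Not the official implementation from the linked repository.
import torch
import torch.nn.functional as F

def zscore_standardize(logits: torch.Tensor, base_temp: float = 2.0) -> torch.Tensor:
    """Per-sample standardization: subtract the mean and divide by the standard
    deviation of each sample's logits, then rescale by a shared base temperature.
    `base_temp` and the epsilon are assumed values for this sketch."""
    mean = logits.mean(dim=-1, keepdim=True)
    std = logits.std(dim=-1, keepdim=True)
    return (logits - mean) / (base_temp * (std + 1e-7))

def kd_loss(student_logits: torch.Tensor, teacher_logits: torch.Tensor,
            base_temp: float = 2.0) -> torch.Tensor:
    """KL-divergence logit distillation with the standardization applied to both
    the student and teacher logits as a plug-and-play preprocessing step."""
    s = zscore_standardize(student_logits, base_temp)
    t = zscore_standardize(teacher_logits, base_temp)
    return F.kl_div(F.log_softmax(s, dim=-1), F.softmax(t, dim=-1),
                    reduction="batchmean")
```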