Knowledge-Distillation-Paper
This repository maintains a collection of important papers on knowledge distillation (awesome-knowledge-distillation).
Issues
Great work. I would like to introduce two papers:

Name: Weight-Inherited Distillation for Task-Agnostic BERT Compression
Paper:
Code: https://github.com/wutaiqiang/WID-NAACL2024
Blog: https://zhuanlan.zhihu.com/p/687294843
TL;DR: Compresses the model via weight inheritance: instead of matching outputs, it directly learns a mapping that projects the teacher model's weights onto the student model (see the sketch after this comment).

Name: Rethinking Kullback-Leibler Divergence in...
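As a rough illustration of the weight-inheritance idea described in the comment above (not the authors' actual WID method; the layer sizes, mapping matrices, and training loop here are all hypothetical), one could learn linear maps that derive a smaller student weight matrix directly from a frozen teacher weight matrix:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical sizes: teacher hidden dim 768, student hidden dim 512.
d_t, d_s = 768, 512
teacher_W = torch.randn(d_t, d_t) / d_t ** 0.5  # frozen teacher weight matrix

# Learnable compression maps: the student's weights are *derived* from the
# teacher's weights rather than trained from scratch or matched via logits.
M_out = nn.Parameter(torch.randn(d_s, d_t) * 0.02)  # compresses the output dim
M_in = nn.Parameter(torch.randn(d_t, d_s) * 0.02)   # compresses the input dim

opt = torch.optim.Adam([M_out, M_in], lr=1e-3)

for step in range(200):
    x = torch.randn(64, d_t)              # toy input activations
    student_W = M_out @ teacher_W @ M_in  # inherited student weights, (d_s, d_s)

    teacher_out = x @ teacher_W.T             # (64, d_t)
    student_out = (x @ M_in) @ student_W.T    # student sees compressed inputs

    # Train only the maps so the student layer mimics the (compressed)
    # teacher output; the teacher's weights themselves stay untouched.
    loss = nn.functional.mse_loss(student_out, teacher_out @ M_in)
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, student_W = M_out @ teacher_W @ M_in can serve directly
# as the compressed layer's weights.
```

The point of the sketch is the contrast with classic distillation: nothing here matches logits or intermediate features; the student's parameters are a learned function of the teacher's parameters.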