knowledge-distillation topic
pytorch-be-your-own-teacher
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation", https://arxiv.org/abs/1905.08094
OKDDip
[AAAI-2020] Official implementation for "Online Knowledge Distillation with Diverse Peers".
VKD
PyTorch code for ECCV 2020 paper: "Robust Re-Identification by Multiple Views Knowledge Distillation"
Knowledge-Distillation
Blog post: https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322
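Most of the repositories above build on the classic distillation objective from Hinton et al. (2015): the student matches the teacher's temperature-softened output distribution, with the loss scaled by T². A minimal, dependency-free sketch (function names and the temperature value are illustrative, not taken from any repo here):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # multiplied by T^2 so gradient magnitudes stay comparable
    # across temperatures (as in the original distillation paper).
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (T ** 2) * kl
```

In practice this term is combined with the ordinary cross-entropy on hard labels, weighted by a hyperparameter; the loss is zero exactly when student and teacher logits induce the same softened distribution.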
MetaDistil
Code for ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".
SemCKD
[AAAI-2021, TKDE-2023] Official implementation for "Cross-Layer Distillation with Semantic Calibration".
Rotated-LD
Rotated Localization Distillation (CVPR 2022, TPAMI 2023)
PocketNet
Official repository for PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation
awesome-nlp-references
A curated list of resources on Knowledge Distillation and Recommender Systems, with an emphasis on Natural Language Processing (NLP).