knowledge-distillation topic
aquvitae
Knowledge Distillation Toolkit
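For orientation, the core recipe toolkits like this implement is vanilla knowledge distillation (Hinton et al., 2015): a small student is trained to match a large teacher's temperature-softened output distribution alongside the hard labels. A minimal PyTorch sketch of that loss; all names here are illustrative, not aquvitae's actual API:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Vanilla KD loss: KL between softened distributions plus hard-label CE.

    T softens the logits; the T*T factor keeps gradient magnitudes
    comparable across temperatures, as in Hinton et al. (2015).
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Typical training step (teacher frozen, only the student is updated):
# with torch.no_grad():
#     teacher_logits = teacher(x)
# loss = distillation_loss(student(x), teacher_logits, y)
```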
What-I-Have-Read
Paper lists, notes, and slides, with a focus on NLP. For summarization, see https://github.com/xcfcode/Summarization-Papers
attention-transfer
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
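The paper's activation-based attention transfer matches normalized spatial attention maps (channel-wise statistics of intermediate activations) between paired teacher and student layers. A rough PyTorch sketch of that loss, simplified from the idea in the paper rather than copied from the repo:

```python
import torch
import torch.nn.functional as F

def attention_map(feat):
    """Spatial attention map of a (N, C, H, W) feature tensor:
    channel-wise mean of squared activations, flattened and
    L2-normalized per sample."""
    am = feat.pow(2).mean(dim=1).flatten(1)  # (N, H*W)
    return F.normalize(am, dim=1)

def at_loss(student_feat, teacher_feat):
    """L2 distance between normalized attention maps of one layer pair.
    Assumes matching spatial sizes; interpolate beforehand otherwise."""
    return (attention_map(student_feat) - attention_map(teacher_feat)).pow(2).mean()
```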
bert-in-production
A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.
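As a point of reference for what "in production" starts from, loading a pretrained BERT for inference with Hugging Face Transformers takes a few lines. This is a generic illustration, not code taken from the linked collection:

```python
from transformers import pipeline

# Downloads the model weights on first run; BERT uses the [MASK] token.
fill = pipeline("fill-mask", model="bert-base-uncased")
print(fill("Knowledge distillation makes BERT [MASK] to serve."))
```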
Efficient_Graph_Similarity_Computation
Slow Learning and Fast Inference: Efficient Graph Similarity Computation via Knowledge Distillation (NeurIPS 2021)
mmrazor
OpenMMLab Model Compression Toolbox and Benchmark.
fasterai
FasterAI: Prune and Distill your models with FastAI and PyTorch
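Conceptually, the pruning half of this boils down to zeroing low-magnitude weights. A plain-PyTorch sketch of unstructured magnitude pruning using torch.nn.utils.prune, not fasterai's own callback API:

```python
import torch.nn as nn
import torch.nn.utils.prune as prune

# Illustrative model: zero out the 50% smallest-magnitude weights
# of every Linear layer.
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the mask into the weights
```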
PaddleClas
A treasure chest for visual classification and recognition powered by PaddlePaddle
distiller
A large-scale study of Knowledge Distillation.
awesome-knowledge-distillation
Awesome Knowledge Distillation