knowledge-distillation topic

Repositories tagged with the knowledge-distillation topic

What-I-Have-Read (163 stars, 16 forks)

Paper lists, notes, and slides, with a focus on NLP. For summarization, see https://github.com/xcfcode/Summarization-Papers

attention-transfer (1.4k stars, 271 forks)

Improving Convolutional Networks via Attention Transfer (ICLR 2017)
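The attention-transfer idea from the ICLR 2017 paper above matches a student's spatial attention maps to a teacher's. This is a minimal numpy sketch, not code from the repository: the attention map of a feature tensor is taken as the channel-wise sum of squared activations, L2-normalized, and the loss is the distance between the student's and teacher's maps.

```python
import numpy as np

def attention_map(feat):
    # feat: (C, H, W) activations; spatial attention = channel-wise sum of squares,
    # flattened and L2-normalized (as in Zagoruyko & Komodakis, ICLR 2017)
    a = (feat ** 2).sum(axis=0).ravel()
    return a / (np.linalg.norm(a) + 1e-8)

def at_loss(student_feat, teacher_feat):
    # L2 distance between normalized attention maps; zero when maps coincide
    return np.linalg.norm(attention_map(student_feat) - attention_map(teacher_feat))
```

In practice this term is summed over several layer pairs and added to the ordinary task loss; shapes only need to agree spatially, since channels are reduced away.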

bert-in-production (91 stars, 13 forks)

A collection of resources on using BERT (https://arxiv.org/abs/1810.04805) and related language models in production environments.

[NeurIPS-2021] Slow Learning and Fast Inference: Efficient Graph Similarity Computation via Knowledge Distillation

mmrazor (1.4k stars, 218 forks)

OpenMMLab Model Compression Toolbox and Benchmark.

fasterai (236 stars, 17 forks)

FasterAI: Prune and Distill your models with FastAI and PyTorch

PaddleClas (5.3k stars, 1.1k forks)

A treasure chest for visual classification and recognition powered by PaddlePaddle

distiller (215 stars, 29 forks)

A large-scale study of Knowledge Distillation.
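For readers new to the topic, the classic distillation objective these repositories build on (Hinton et al.'s softened-logit matching) can be sketched in a few lines of numpy. This is an illustrative assumption-laden sketch, not code from any listed repo: the student is trained to match the teacher's temperature-softened class distribution via KL divergence, scaled by T² so gradients stay comparable across temperatures.

```python
import numpy as np

def softmax(z, T=1.0):
    # numerically stable temperature-scaled softmax
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on softened distributions, scaled by T^2
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T ** 2) * np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1).mean()
```

In a full training loop this term is typically mixed with the ordinary cross-entropy on ground-truth labels, weighted by a hyperparameter such as alpha.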