knowledge-distillation topic

List knowledge-distillation repositories

Pretrained-Language-Model

3.0k Stars · 624 Forks

Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.

knowledge-distillation-pytorch

1.8k Stars · 341 Forks

A flexible PyTorch implementation for exploring deep and shallow knowledge distillation (KD) experiments.

NeuronBlocks

1.4k Stars · 192 Forks

NLP DNN Toolkit - Building Your NLP DNN Models Like Playing Lego

Knowledge-Distillation-Zoo

1.5k Stars · 262 Forks

PyTorch implementation of various knowledge distillation (KD) methods.
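For orientation, the objective that most of these KD repositories build on is the temperature-softened distillation loss of Hinton et al. (2015). The sketch below is a minimal plain-Python illustration of that idea and is not taken from any repository listed here; the function names and the choice of temperature are illustrative assumptions.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution
    # that exposes the teacher's "dark knowledge" about wrong classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T
    # (as in Hinton et al., 2015). Illustrative T=4.0 is an assumption.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl
```

In practice this term is combined with the ordinary cross-entropy loss on hard labels, weighted by a mixing coefficient; the repositories above differ mainly in what is matched (logits, features, attention maps) and how.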

Efficient-Deep-Learning

904 Stars · 128 Forks

Collection of recent methods on (deep) neural network compression and acceleration.

Efficient-Computing

1.1k Stars · 198 Forks

Efficient computing methods developed by Huawei Noah's Ark Lab.

KD_Lib

575 Stars · 57 Forks

A PyTorch knowledge distillation library for benchmarking and extending work in the domains of knowledge distillation, pruning, and quantization.

awesome-ai-infrastructures

370 Stars · 71 Forks

Infrastructures™ for Machine Learning Training/Inference in Production.

phpstan-deprecation-rules

370 Stars · 71 Forks

PHPStan rules for detecting usage of deprecated classes, methods, properties, constants, and traits.

pytorch_classification

1.3k Stars · 338 Forks

A complete PyTorch codebase for image classification: training, prediction, test-time augmentation (TTA), model ensembling, model deployment, CNN feature extraction, classification with SVM or random forest, and model distillation.