Distillation topic

A list of repositories related to distillation.

PaddleSlim

1.5k Stars · 348 Forks

PaddleSlim is an open-source library for deep model compression and architecture search.
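
Libraries in this space typically implement the standard knowledge-distillation objective: cross-entropy on the hard labels plus a temperature-scaled KL term between teacher and student logits. A minimal PyTorch sketch of that objective (not PaddleSlim's API; the temperature and weighting values are placeholders):

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD objective: CE on hard labels + temperature-scaled KL to the teacher."""
    # Soft-target term; scaled by T**2 to keep its gradient magnitude comparable
    # to the hard-label term as the temperature grows.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```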

mobile-yolov5-pruning-distillation

816 Stars · 166 Forks

Pruning and distillation for mobilev2-yolov5s, with support for ncnn and TensorRT deployment. Ultra-light but with better performance!
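
Channel pruning of this kind is often based on ranking convolutional filters by an importance score, keeping only the top fraction, and then fine-tuning (frequently with a distillation loss against the unpruned model). A rough PyTorch sketch of L1-norm filter ranking (a generic illustration, not this repository's pipeline; the layer and keep ratio are placeholders):

```python
import torch
import torch.nn as nn

def select_channels_by_l1(conv: nn.Conv2d, keep_ratio: float = 0.5) -> torch.Tensor:
    """Return indices of output channels to keep, ranked by L1 norm of their filters."""
    # conv.weight has shape (out_channels, in_channels, kH, kW).
    scores = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    return torch.topk(scores, n_keep).indices.sort().values

# Usage: slice the weight tensor (and downstream layers) with the kept indices, then fine-tune.
conv = nn.Conv2d(32, 64, kernel_size=3)
kept = select_channels_by_l1(conv, keep_ratio=0.5)
pruned_weight = conv.weight.detach()[kept]  # shape (32, 32, 3, 3)
```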

continual-learning

1.5k Stars · 305 Forks

PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
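
As one concrete example from the list of methods, EWC adds a quadratic penalty that anchors parameters that were important for earlier tasks, with per-parameter importance usually taken from a diagonal Fisher estimate. A minimal PyTorch sketch of that penalty (a generic illustration, not this repository's code; `fisher` and `old_params` are assumed to be precomputed per-parameter dictionaries):

```python
import torch
import torch.nn as nn

def ewc_penalty(model: nn.Module, fisher: dict, old_params: dict, lam: float = 1000.0):
    """Quadratic EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

# During training on a new task, the penalty is simply added to the task loss:
# loss = task_loss + ewc_penalty(model, fisher, old_params)
```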

brain-inspired-replay

218 Stars · 63 Forks

A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
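
The core of generative replay is to interleave samples drawn from a generative model of earlier tasks with the current batch, supervising the replayed samples with the previous model's soft predictions. A compact PyTorch sketch of one such training step (a generic illustration under assumed `generator`, `prev_model`, and `model` objects, not this repository's implementation):

```python
import torch
import torch.nn.functional as F

def replay_training_step(model, prev_model, generator, x_new, y_new, optimizer, replay_weight=1.0):
    """One training step with generative replay: real new-task data + replayed pseudo-samples."""
    with torch.no_grad():
        x_replay = generator.sample(x_new.size(0))            # assumed generator API
        y_replay_soft = F.softmax(prev_model(x_replay), dim=-1)

    loss_new = F.cross_entropy(model(x_new), y_new)
    loss_replay = F.kl_div(
        F.log_softmax(model(x_replay), dim=-1), y_replay_soft, reduction="batchmean"
    )
    loss = loss_new + replay_weight * loss_replay

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```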

Keras_insightface

224 Stars · 54 Forks

A Keras implementation of InsightFace.

SimpleAICV_pytorch_training_examples

411 Stars · 95 Forks

SimpleAICV: PyTorch training and testing examples.

CLUEPretrainedModels

792 Stars · 96 Forks

A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for similarity.

TextBrewer

1.6k Stars · 235 Forks

A PyTorch-based knowledge distillation toolkit for natural language processing
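
Beyond matching output logits, NLP distillation toolkits commonly also match intermediate hidden states between teacher and student, projecting the student's hidden size up to the teacher's when they differ. A minimal PyTorch sketch of such a hidden-state matching loss (a generic illustration, not TextBrewer's actual API; the projection layer and the layer pairing are assumptions):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HiddenStateMatcher(nn.Module):
    """MSE between a projected student hidden state and a (detached) teacher hidden state."""

    def __init__(self, student_dim: int, teacher_dim: int):
        super().__init__()
        self.proj = nn.Linear(student_dim, teacher_dim)

    def forward(self, student_hidden: torch.Tensor, teacher_hidden: torch.Tensor):
        # Both tensors: (batch, seq_len, hidden_dim); the teacher is treated as a fixed target.
        return F.mse_loss(self.proj(student_hidden), teacher_hidden.detach())

# Usage with placeholder shapes: pair chosen student/teacher layers and sum the losses.
matcher = HiddenStateMatcher(student_dim=384, teacher_dim=768)
loss = matcher(torch.randn(2, 16, 384), torch.randn(2, 16, 768))
```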

DistilKoBERT

182 Stars · 25 Forks

Distillation of KoBERT from SKTBrain (Lightweight KoBERT)

WorldOnRails

160 Stars · 27 Forks

(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model