distillation topic
PaddleSlim
PaddleSlim is an open-source library for deep model compression and architecture search.
mobile-yolov5-pruning-distillation
mobilev2-yolov5s pruning and distillation, with ncnn and TensorRT deployment support. Ultra-light but with better performance!
continual-learning
PyTorch implementation of various methods for continual learning (XdG, EWC, SI, LwF, FROMP, DGR, BI-R, ER, A-GEM, iCaRL, Generative Classifier) in three different scenarios.
brain-inspired-replay
A brain-inspired version of generative replay for continual learning with deep neural networks (e.g., class-incremental learning on CIFAR-100; PyTorch code).
Keras_insightface
Keras implementation of InsightFace
SimpleAICV_pytorch_training_examples
SimpleAICV: PyTorch training and testing examples.
CLUEPretrainedModels
A collection of high-quality Chinese pre-trained models: state-of-the-art large models, the fastest small models, and models specialized for similarity tasks
TextBrewer
A PyTorch-based knowledge distillation toolkit for natural language processing
DistilKoBERT
A distilled, lightweight version of SKTBrain's KoBERT
WorldOnRails
(ICCV 2021, Oral) RL and distillation in CARLA using a factorized world model
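Several of the repositories above (TextBrewer, DistilKoBERT, PaddleSlim) build on Hinton-style knowledge distillation, where a student network is trained to match a teacher's temperature-softened output distribution. A minimal, dependency-free sketch of that loss; the function names here are illustrative and not taken from any listed library:

```python
import math

def softmax(logits, temperature=1.0):
    # Softmax with temperature scaling: a higher T yields a softer,
    # more informative distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across T
    # (as in the standard Hinton et al. formulation).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Identical logits give zero loss; mismatched logits give a positive loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))
print(distillation_loss([2.0, 1.0, 0.1], [0.1, 1.0, 2.0]))
```

In practice this KL term is combined with the ordinary cross-entropy loss on the true labels, weighted by a mixing coefficient; the toolkits listed above expose that combination through their own configuration APIs.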