knowledge-distillation topic

Repositories under the knowledge-distillation topic

overhaul-distillation

410 Stars · 77 Forks

Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)

fast-human-pose-estimation.pytorch

396 Stars · 66 Forks

Official PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419

mdistiller

789 Stars · 120 Forks

The official implementation of "Decoupled Knowledge Distillation" (CVPR 2022, https://arxiv.org/abs/2203.08679) and "DOT: A Distillation-Oriented Trainer" (ICCV 2023, https://openaccess.thecvf.com/content/I...)

Fast_Human_Pose_Estimation_Pytorch

325 Stars · 53 Forks

PyTorch code for the CVPR 2019 paper "Fast Human Pose Estimation" https://arxiv.org/abs/1811.05419

KD_methods_with_TF

265 Stars · 59 Forks

Knowledge distillation methods implemented in TensorFlow (currently 11 (+1) methods, with more to be added).

LD

346 Stars · 50 Forks

Localization Distillation for Object Detection (CVPR 2022, TPAMI 2023)

Teacher-Assistant-Knowledge-Distillation

249 Stars · 48 Forks

Using Teacher Assistants to Improve Knowledge Distillation: https://arxiv.org/pdf/1902.03393.pdf

FGD

334 Stars · 44 Forks

Focal and Global Knowledge Distillation for Detectors (CVPR 2022)

KnowledgeDistillation

207 Stars · 52 Forks

Knowledge distillation for text classification in PyTorch. Chinese text classification with BERT and XLNet as teacher models and a biLSTM student model.

AB_distillation

103 Stars · 18 Forks

Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
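Most of the repositories above build on the classic soft-target distillation objective of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution, with the loss scaled by T². The sketch below is a minimal stdlib-only illustration of that loss (the function names `softmax` and `kd_loss` are illustrative, not taken from any repository listed here); the real implementations operate on framework tensors and combine this term with a standard cross-entropy loss on the ground-truth labels.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T*T so gradient magnitudes stay comparable across T
    # (as in Hinton et al., "Distilling the Knowledge in a Neural Network").
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    return (T * T) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when student and teacher logits agree and positive otherwise; in training it is typically mixed with the hard-label cross-entropy via a weighting coefficient.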