knowledge-distillation topic

List knowledge-distillation repositories

Cream
1.6k Stars · 220 Forks
This is a collection of our NAS and Vision Transformer work.

Distill-BERT-Textgen
130 Stars · 18 Forks
Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".

kdtf
140 Stars · 46 Forks
Knowledge Distillation using TensorFlow.
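For orientation, the core idea behind repositories like this one is the classic soft-target distillation loss of Hinton et al. The sketch below is a minimal TensorFlow 2 illustration, not code taken from any listed repository; the function name and the temperature/alpha defaults are illustrative assumptions.

```python
import tensorflow as tf

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Blend a soft-target KL term (teacher vs. student) with the usual
    hard-label cross-entropy, following the standard KD recipe."""
    # Soften both distributions with the same temperature.
    soft_teacher = tf.nn.softmax(teacher_logits / temperature)
    soft_student = tf.nn.softmax(student_logits / temperature)
    kd_term = tf.keras.losses.KLDivergence()(soft_teacher, soft_student)
    # Standard cross-entropy against the ground-truth labels.
    ce_term = tf.keras.losses.SparseCategoricalCrossentropy(
        from_logits=True)(labels, student_logits)
    # Scaling by T^2 keeps the soft-target gradient magnitude comparable.
    return alpha * (temperature ** 2) * kd_term + (1.0 - alpha) * ce_term
```

In practice the teacher is frozen, the student is trained on this combined loss, and temperature/alpha are tuned per task; the individual repositories below vary in what signal (logits, features, masks) they distill.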

Object-Detection-Knowledge-Distillation
212 Stars · 51 Forks
An object detection knowledge distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.

stagewise-knowledge-distillation
113 Stars · 7 Forks
Code implementation of the paper "Data Efficient Stagewise Knowledge Distillation".

MutualGuide
113 Stars · 11 Forks
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection

MGD
197 Stars · 23 Forks
Masked Generative Distillation (ECCV 2022)

Repository to track the progress in model compression and acceleration.

Knowledge_distillation_via_TF2.0
105 Stars · 30 Forks
Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF2.0 low-level API.

DHM
84 Stars · 18 Forks
[CVPR 2020] Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives