knowledge-distillation topic
Cream
This is a collection of our NAS and Vision Transformer work.
Distill-BERT-Textgen
Research code for ACL 2020 paper: "Distilling Knowledge Learned in BERT for Text Generation".
kdtf
Knowledge Distillation using TensorFlow
Object-Detection-Knowledge-Distillation
An Object Detection Knowledge Distillation framework powered by PyTorch, currently supporting SSD and YOLOv5.
stagewise-knowledge-distillation
Code implementation of the paper "Data Efficient Stagewise Knowledge Distillation".
MutualGuide
Localize to Classify and Classify to Localize: Mutual Guidance in Object Detection
MGD
Masked Generative Distillation (ECCV 2022)
model-compression-and-acceleration-progress
Repository to track the progress in model compression and acceleration
Knowledge_distillation_via_TF2.0
Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF 2.0 low-level API
DHM
[CVPR 2020] Dynamic Hierarchical Mimicking Towards Consistent Optimization Objectives
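The repositories above all build on the same core idea: training a small student network to match a large teacher's softened output distribution. As a minimal sketch (not taken from any of the listed repositories), the classic Hinton-style soft-target loss can be written in plain NumPy; the function names and default hyperparameters (temperature T, mixing weight alpha) here are illustrative choices, not values from any specific repo:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # Soft-target term: KL divergence between the teacher's and the
    # student's temperature-softened distributions, scaled by T^2 so its
    # gradient magnitude stays comparable across temperatures.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Hard-target term: ordinary cross-entropy against ground-truth labels.
    p = softmax(student_logits)
    ce = -np.log(p[np.arange(len(labels)), labels] + 1e-12)
    # alpha blends the two terms; alpha=1 uses only the teacher's soft targets.
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

The object-detection variants listed above (SSD/YOLOv5 distillation, MGD's masked feature distillation) replace or augment this logit-matching term with losses on intermediate feature maps, but the student-mimics-teacher structure is the same.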