knowledge-distillation topic

Repositories tagged with knowledge-distillation:
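Many of the repositories below build on the classic distillation objective of Hinton et al. (2015): the student is trained to match the teacher's temperature-softened output distribution. As context for the list, here is a minimal pure-Python sketch of that loss; the function names and the temperature value are illustrative, not taken from any repository below.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) between softened distributions,
    scaled by T^2 as in Hinton et al. (2015)."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# Identical logits give zero distillation loss.
print(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
```

In practice this term is combined with the ordinary cross-entropy loss on hard labels, weighted by a mixing coefficient; several of the frameworks listed here (e.g. the toolkit and FKD entries) wrap variants of exactly this recipe.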

BSS_distillation (70 stars, 11 forks)

Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)

MLIC-KD-WSD (57 stars, 25 forks)

Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)

Knowledge-Distillation-Toolkit (136 stars, 25 forks)

[DEPRECATED] A knowledge distillation toolkit based on PyTorch and PyTorch Lightning.

MoTIS (116 stars, 10 forks)

[NAACL 2022] Mobile text-to-image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP)

IFRNet (241 stars, 23 forks)

IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)

FKD (176 stars, 31 forks)

Official code for the ECCV 2022 paper "A Fast Knowledge Distillation Framework for Visual Recognition"

Nasty-Teacher (79 stars, 13 forks)

[ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang

ProSelfLC-AT (58 stars, 2 forks)

Noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.

ACCV_TinyGAN (74 stars, 9 forks)

BigGAN; knowledge distillation; black-box; fast training; 16x compression

IL-SemSegm (57 stars, 2 forks)

Code for the paper "Incremental Learning Techniques for Semantic Segmentation", Michieli U. and Zanuttigh P., ICCVW 2019