knowledge-distillation topic
BSS_distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
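BSS distills on adversarial samples nudged toward the teacher's decision boundary; the boundary-attack step is repo-specific, but the objective it augments is the standard Hinton-style distillation loss. A minimal PyTorch sketch of that base loss (function name and hyperparameters are illustrative, not taken from the repo):

```python
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style distillation: soft KL term plus hard cross-entropy term."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable across temperatures
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```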
MLIC-KD-WSD
Multi-Label Image Classification via Knowledge Distillation from Weakly-Supervised Detection (ACM MM 2018)
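This repo transfers knowledge across tasks, from a weakly-supervised detector to a multi-label classifier. As a rough sketch of the simplest logit-matching form of multi-label distillation (not necessarily the repo's actual pipeline; `alpha` is an illustrative weight):

```python
import torch.nn.functional as F

def multilabel_kd_loss(student_logits, teacher_logits, targets, alpha=0.5):
    """BCE against ground-truth multi-hot labels plus soft BCE against the
    teacher's per-label probabilities."""
    hard = F.binary_cross_entropy_with_logits(student_logits, targets)
    soft = F.binary_cross_entropy_with_logits(student_logits, teacher_logits.sigmoid())
    return alpha * soft + (1.0 - alpha) * hard
```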
Knowledge-Distillation-Toolkit
:no_entry: [DEPRECATED] A knowledge distillation toolkit based on PyTorch and PyTorch Lightning.
MoTIS
[NAACL 2022] Mobile Text-to-Image search powered by multimodal semantic representation models (e.g., OpenAI's CLIP)
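As a sketch of the dual-encoder retrieval primitive such an app is built on, using the openai/CLIP package (the image paths and query string are placeholders):

```python
import clip  # pip install git+https://github.com/openai/CLIP.git
import torch
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

# Encode the image gallery once; answer text queries by cosine similarity.
paths = ["a.jpg", "b.jpg"]  # placeholder image files
images = torch.stack([preprocess(Image.open(p)) for p in paths]).to(device)
text = clip.tokenize(["a photo of a dog"]).to(device)

with torch.no_grad():
    img_feats = model.encode_image(images)
    txt_feats = model.encode_text(text)
img_feats = img_feats / img_feats.norm(dim=-1, keepdim=True)
txt_feats = txt_feats / txt_feats.norm(dim=-1, keepdim=True)
best = (txt_feats @ img_feats.T).argmax(dim=-1)  # index of best match
print(paths[best.item()])
```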
IFRNet
IFRNet: Intermediate Feature Refine Network for Efficient Frame Interpolation (CVPR 2022)
FKD
Official code for our ECCV'22 paper "A Fast Knowledge Distillation Framework for Visual Recognition"
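FKD's stated aim is to make distillation cheap by avoiding repeated teacher forward passes. A minimal sketch of that general strategy, precomputing and caching teacher soft labels (function and file names are illustrative; the actual framework stores region-level soft labels per crop):

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def cache_soft_labels(teacher, loader, path="soft_labels.pt"):
    """Run the (expensive) teacher exactly once and store its predictions."""
    teacher.eval()
    probs = torch.cat([F.softmax(teacher(x), dim=1).cpu() for x, _ in loader])
    torch.save(probs, path)

def student_step(student, x, cached_probs, T=1.0):
    """Distillation step with no teacher forward pass: train against the cache."""
    log_p = F.log_softmax(student(x) / T, dim=1)
    return F.kl_div(log_p, cached_probs, reduction="batchmean") * (T * T)
```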
Nasty-Teacher
[ICLR 2021 Spotlight Oral] "Undistillable: Making A Nasty Teacher That CANNOT teach students", Haoyu Ma, Tianlong Chen, Ting-Kuei Hu, Chenyu You, Xiaohui Xie, Zhangyang Wang
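The paper's self-undermining idea: the nasty teacher stays accurate on hard labels while pushing its softened outputs away from those of a normally trained reference network, so that distilling from it hurts the student. A rough sketch (`omega` and `tau` are illustrative hyperparameters):

```python
import torch.nn.functional as F

def nasty_teacher_loss(nasty_logits, reference_logits, labels, tau=4.0, omega=0.04):
    """Stay accurate on hard labels while diverging from a normal network's
    softened predictions, making the logits useless for distillation."""
    ce = F.cross_entropy(nasty_logits, labels)
    kl = F.kl_div(
        F.log_softmax(nasty_logits / tau, dim=1),
        F.softmax(reference_logits.detach() / tau, dim=1),
        reduction="batchmean",
    ) * (tau * tau)
    return ce - omega * kl  # subtracting KL maximizes divergence from the reference
```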
ProSelfLC-AT
ProSelfLC: progressive self label correction for learning with noisy or missing labels; topics include semi-supervised learning, entropy, uncertainty, robustness, and generalisation.
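ProSelfLC's core move is to progressively correct training targets using the model's own predictions. A heavily simplified sketch of that convex combination (the paper additionally modulates the trust coefficient by prediction confidence; the schedule below is illustrative):

```python
import torch.nn.functional as F

def self_corrected_target(onehot, probs, step, total_steps):
    """Convex combination of the annotated label and the model's own
    (detached) prediction; trust in self-knowledge grows with training."""
    eps = 0.5 * step / total_steps  # illustrative schedule, capped at 0.5
    return (1.0 - eps) * onehot + eps * probs.detach()

def proselflc_style_loss(logits, onehot, step, total_steps):
    """Cross-entropy against the progressively corrected soft target."""
    target = self_corrected_target(onehot, logits.softmax(dim=1), step, total_steps)
    return -(target * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```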
ACCV_TinyGAN
TinyGAN (ACCV 2020): black-box knowledge distillation of BigGAN, with fast training and 16x compression.
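Black-box here means the student generator only sees input-output pairs sampled from the teacher, never its weights or features. A minimal sketch of the pixel-level imitation term (the paper also uses adversarial and feature-level losses; `student_G`'s signature is assumed):

```python
import torch.nn.functional as F

def black_box_imitation_loss(student_G, z, class_ids, teacher_images):
    """The student only ever sees (latent, class, output-image) triples
    sampled from BigGAN -- no access to its internals."""
    fake = student_G(z, class_ids)  # assumed student generator signature
    return F.l1_loss(fake, teacher_images)
```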
IL-SemSegm
Code for the paper "Incremental Learning Techniques for Semantic Segmentation", Michieli U. and Zanuttigh P., ICCVW, 2019
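The incremental setting fine-tunes on new classes while a frozen copy of the previous model supplies per-pixel soft targets for the old ones. A simplified sketch (the paper explores several distillation variants; `num_old`, `lam`, and the ignore index are illustrative):

```python
import torch.nn.functional as F

def incremental_seg_loss(new_logits, old_logits, labels, num_old, lam=1.0):
    """Cross-entropy on the new annotations plus a per-pixel distillation term
    keeping old-class responses close to the frozen previous model.
    new_logits: (B, C_new, H, W); old_logits: (B, num_old, H, W)."""
    ce = F.cross_entropy(new_logits, labels, ignore_index=255)
    kd = F.kl_div(
        F.log_softmax(new_logits[:, :num_old], dim=1),
        F.softmax(old_logits.detach(), dim=1),
        reduction="batchmean",
    )
    return ce + lam * kd
```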