Repositories under the knowledge-distillation topic
knowledge_evolution
(CVPR 2021 oral) PyTorch implementation of the Knowledge Evolution approach and Split-Nets
URL
Universal Representation Learning from Multiple Domains for Few-shot Classification (ICCV 2021) and Cross-domain Few-shot Learning with Task-specific Adapters (CVPR 2022)
FunMatch-Distillation
TF2 implementation of knowledge distillation using the "function matching" hypothesis from https://arxiv.org/abs/2106.05237.
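The "function matching" recipe reduces distillation to making the student match the teacher's full output distribution on identical, aggressively augmented inputs, with no ground-truth labels. A minimal sketch of that objective (in PyTorch rather than the repo's TF2; `teacher`, `student`, and `temperature` are placeholders, not the repo's API):

```python
import torch
import torch.nn.functional as F

def funmatch_loss(teacher, student, images, temperature=1.0):
    """KL(teacher || student) on one batch; no ground-truth labels are used."""
    with torch.no_grad():
        t_logits = teacher(images)  # teacher is frozen
    s_logits = student(images)
    t_prob = F.softmax(t_logits / temperature, dim=-1)
    s_logprob = F.log_softmax(s_logits / temperature, dim=-1)
    # Scale by T^2, as is conventional when distilling with a temperature.
    return F.kl_div(s_logprob, t_prob, reduction="batchmean") * temperature**2
```

The paper's other ingredients, feeding teacher and student the same augmented (e.g. mixup) views and training for very long schedules, live in the training loop rather than in this loss.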
Two-stage-Knowledge-For-Multiple-Adverse-Weather-Removal
[CVPR 2022] Learning Multiple Adverse Weather Removal via Two-stage Knowledge Learning and Multi-contrastive Regularization: Toward a Unified Model
Data-Free-Adversarial-Distillation
Code and pretrained models for the paper "Data-Free Adversarial Distillation"
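"Data-free" here means the transfer set is synthesized on the fly: a generator is trained adversarially to produce inputs on which teacher and student disagree most, while the student is trained to agree on those same inputs. A hedged sketch of one round of that game (module and optimizer names are placeholders; the authors' exact losses and schedule differ):

```python
import torch
import torch.nn.functional as F

def dfad_round(teacher, student, generator, opt_s, opt_g,
               z_dim=100, batch_size=64, device="cpu"):
    # Teacher is assumed frozen (eval mode, requires_grad=False).
    # Student step: minimize teacher-student discrepancy on generated samples.
    z = torch.randn(batch_size, z_dim, device=device)
    fake = generator(z).detach()
    with torch.no_grad():
        t_out = teacher(fake)
    loss_s = F.l1_loss(student(fake), t_out)
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()

    # Generator step: maximize the same discrepancy (the adversarial move).
    z = torch.randn(batch_size, z_dim, device=device)
    fake = generator(z)
    loss_g = -F.l1_loss(student(fake), teacher(fake))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
    return loss_s.item(), loss_g.item()
```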
CompRess
Compressing Representations for Self-Supervised Learning
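What gets compressed is the teacher's embedding space: the student is trained so that its similarity distribution over a bank of anchor embeddings matches the teacher's, transferring neighborhood structure rather than labels. A minimal sketch under that reading (`bank` and `tau` are illustrative, not the repo's API):

```python
import torch
import torch.nn.functional as F

def compress_loss(t_emb, s_emb, bank, tau=0.04):
    """KL between teacher and student similarity distributions over a bank."""
    t_emb = F.normalize(t_emb, dim=1)
    s_emb = F.normalize(s_emb, dim=1)
    bank = F.normalize(bank, dim=1)          # (N, D) anchor embeddings
    t_dist = F.softmax(t_emb @ bank.t() / tau, dim=1)
    s_logdist = F.log_softmax(s_emb @ bank.t() / tau, dim=1)
    return F.kl_div(s_logdist, t_dist, reduction="batchmean")
```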
BAKE
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification
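BAKE needs no separate teacher: soft targets are produced online by ensembling the predictions of the other samples in the same batch, weighted by feature affinity. A simplified single-propagation-step sketch (the paper's propagation and weighting details differ):

```python
import torch
import torch.nn.functional as F

def bake_targets(features, probs, tau=0.5, omega=0.5):
    """Blend each sample's prediction with an affinity-weighted batch ensemble."""
    f = F.normalize(features, dim=1)
    affinity = f @ f.t() / tau
    affinity.fill_diagonal_(float("-inf"))   # a sample never teaches itself
    weights = F.softmax(affinity, dim=1)     # (B, B) ensembling weights
    soft = omega * probs + (1.0 - omega) * weights @ probs
    return soft.detach()                     # targets carry no gradient
```

The student is then trained with cross-entropy against these refreshed targets in place of, or alongside, the one-hot labels.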
mgd
Matching Guided Distillation (ECCV 2020)
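Matching guided distillation tackles the channel mismatch between teacher and student features without a learned adapter, by explicitly matching channels before computing the feature loss. The sketch below uses a plain minimum-cost assignment to pair each student channel with a teacher channel (MGD's actual matching and channel-reduction schemes are more involved):

```python
import torch
import torch.nn.functional as F
from scipy.optimize import linear_sum_assignment

def matched_channel_loss(t_feat, s_feat):
    """t_feat: (B, Ct, H, W) teacher features; s_feat: (B, Cs, H, W), Cs <= Ct."""
    t_feat = t_feat.detach()                              # teacher is frozen
    Ct, Cs = t_feat.shape[1], s_feat.shape[1]
    t_flat = t_feat.permute(1, 0, 2, 3).reshape(Ct, -1)   # one row per channel
    s_flat = s_feat.permute(1, 0, 2, 3).reshape(Cs, -1)
    cost = torch.cdist(s_flat, t_flat)                    # (Cs, Ct) distances
    idx_s, idx_t = linear_sum_assignment(cost.detach().cpu().numpy())
    idx_s, idx_t = torch.as_tensor(idx_s), torch.as_tensor(idx_t)
    return F.mse_loss(s_flat[idx_s], t_flat[idx_t])
```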
DiscoNet
[NeurIPS 2021] Learning Distilled Collaboration Graph for Multi-Agent Perception
MultilangStructureKD
[ACL 2020] Structure-Level Knowledge Distillation For Multilingual Sequence Labeling