knowledge-distillation topic

List knowledge-distillation repositories
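For orientation, the baseline recipe that most repositories under this topic build on is Hinton-style soft-label distillation: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence term scaled by T². A minimal NumPy sketch, assuming plain logit arrays (the function names and the temperature value are illustrative, not taken from any repository listed below):

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax, computed stably."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL(teacher || student) over temperature-softened distributions,
    scaled by T^2 so gradients stay comparable across temperatures."""
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = np.sum(p * (np.log(p) - np.log(q)), axis=-1)
    return (temperature ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the repositories below each replace or extend it in different ways (data-free generation, representation compression, batch ensembling, and so on).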

knowledge_evolution (83 stars, 15 forks)
(CVPR-Oral 2021) PyTorch implementation of the Knowledge Evolution approach and Split-Nets

URL (122 stars, 17 forks)
Universal Representation Learning from Multiple Domains for Few-shot Classification (ICCV 2021); Cross-domain Few-shot Learning with Task-specific Adapters (CVPR 2022)

FunMatch-Distillation (82 stars, 8 forks)
TF2 implementation of knowledge distillation using the "function matching" hypothesis from https://arxiv.org/abs/2106.05237

[CVPR 2022] Learning Multiple Adverse Weather Removal via Two-stage Knowledge Learning and Multi-contrastive Regularization: Toward a Unified Model

Data-Free-Adversarial-Distillation (95 stars, 18 forks)
Code and pretrained models for the paper Data-Free Adversarial Distillation

CompRess (77 stars, 12 forks)
Compressing Representations for Self-Supervised Learning

BAKE (81 stars, 4 forks)
Self-distillation with Batch Knowledge Ensembling Improves ImageNet Classification

mgd (64 stars, 13 forks)
Matching Guided Distillation (ECCV 2020)

DiscoNet (130 stars, 17 forks)
[NeurIPS 2021] Learning Distilled Collaboration Graph for Multi-Agent Perception

MultilangStructureKD (72 stars, 9 forks)
[ACL 2020] Structure-Level Knowledge Distillation for Multilingual Sequence Labeling