knowledge-distillation topic

A list of repositories tagged with the knowledge-distillation topic, with star and fork counts:

pytorch-be-your-own-teacher (144 stars, 27 forks)

A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094). See the sketch below for the soft-target loss this method builds on.
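
Self-distillation reuses the classic soft-target objective inside a single network, with deeper classifiers teaching shallower ones. As a reference point, here is a minimal PyTorch sketch of the standard distillation loss (Hinton et al., 2015) that most repositories in this list build on; the function name, temperature T, and weight alpha are illustrative assumptions, not code from any repository listed here.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Classic soft-target distillation loss (Hinton et al., 2015).

    T and alpha are hypothetical defaults; tune them per task.
    """
    # Soft targets: KL divergence between temperature-softened distributions.
    # The T*T factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: ordinary cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, targets)
    return alpha * soft + (1.0 - alpha) * hard
```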

OKDDip (72 stars, 12 forks)

[AAAI 2020] Official implementation of "Online Knowledge Distillation with Diverse Peers". A sketch of the basic online-distillation setup follows this entry.
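
In online distillation there is no pretrained teacher: a group of peer students trains together, each distilling from an aggregate of the others' predictions. OKDDip itself aggregates peers with learned attention weights to keep them diverse; the sketch below shows only the simpler mean-of-peers soft target that online methods start from. All names and the temperature are hypothetical.

```python
import torch
import torch.nn.functional as F

def peer_distillation_loss(peer_logits, T=3.0):
    """Each peer distills from the mean of the other peers' soft predictions.

    peer_logits: list of [batch, classes] tensors, one per peer network.
    Mean-of-peers is the simplest aggregation; OKDDip instead learns
    attention weights over peers.
    """
    losses = []
    for i, logits in enumerate(peer_logits):
        others = [p for j, p in enumerate(peer_logits) if j != i]
        # Soft target: average of the other peers' temperature-softened outputs.
        target = torch.stack([F.softmax(p / T, dim=1) for p in others]).mean(0)
        losses.append(
            F.kl_div(F.log_softmax(logits / T, dim=1),
                     target.detach(), reduction="batchmean") * T * T
        )
    return sum(losses) / len(losses)
```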

VKD (72 stars, 15 forks)

PyTorch code for the ECCV 2020 paper "Robust Re-Identification by Multiple Views Knowledge Distillation".

Knowledge-Distillation (60 stars, 18 forks)

Code accompanying the blog post at https://medium.com/neuralmachine/knowledge-distillation-dc241d7c2322.

MetaDistil (79 stars, 14 forks)

Code for the ACL 2022 paper "BERT Learns to Teach: Knowledge Distillation with Meta Learning".

SemCKD (75 stars, 14 forks)

[AAAI 2021, TKDE 2023] Official implementation of "Cross-Layer Distillation with Semantic Calibration". A sketch of the feature-matching ingredient follows this entry.
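
Cross-layer methods distill intermediate feature maps rather than output logits. SemCKD additionally learns an attention-based assignment of teacher layers to student layers; the sketch below shows only the underlying single-pair ingredient, projecting a student feature map and regressing it onto a teacher's with MSE. The class and parameter names are hypothetical.

```python
import torch.nn as nn
import torch.nn.functional as F

class FeatureDistiller(nn.Module):
    """Regress one student feature map onto one teacher feature map.

    SemCKD learns attention weights over many such pairs; this shows
    a single projection + MSE term.
    """
    def __init__(self, student_channels, teacher_channels):
        super().__init__()
        # 1x1 conv aligns channel dimensions between student and teacher.
        self.proj = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, f_student, f_teacher):
        f_student = self.proj(f_student)
        # Match spatial size if the two stages downsample differently.
        if f_student.shape[-2:] != f_teacher.shape[-2:]:
            f_student = F.adaptive_avg_pool2d(f_student, f_teacher.shape[-2:])
        # Teacher features are fixed targets, so gradients stop there.
        return F.mse_loss(f_student, f_teacher.detach())
```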

Rotated-LD (44 stars, 2 forks)

Rotated Localization Distillation (CVPR 2022, TPAMI 2023).

PocketNet (56 stars, 11 forks)

Official repository for "PocketNet: Extreme Lightweight Face Recognition Network using Neural Architecture Search and Multi-Step Knowledge Distillation".

awesome-nlp-references (34 stars, 5 forks)

A curated list of resources on knowledge distillation and recommender systems, with a focus on natural language processing (NLP).