knowledge-distillation topic

Repositories in the knowledge-distillation topic:
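
For context, knowledge distillation trains a compact student model to match a larger teacher's temperature-softened output distribution in addition to the ground-truth labels. Below is a minimal sketch of the classic soft-target loss (Hinton et al., 2015); the function name and hyperparameter values are illustrative and not taken from any repository listed here.

```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.9):
    """Blend a softened teacher-matching term with the usual hard-label loss.
    All names and defaults here are hypothetical, for illustration only."""
    # KL divergence between temperature-softened distributions; the T**2
    # factor keeps gradient magnitudes comparable across temperatures.
    soft = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    # Standard cross-entropy against the ground-truth labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```

The repositories below adapt or extend this basic objective to settings such as LiDAR segmentation, object detection, and domain adaptation.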

2DPASS

386 Stars · 52 Forks

2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022) :fire:

efficient-bert

32 Stars · 4 Forks

This repository contains the code for the Findings of EMNLP 2021 paper "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".

SSAN

29 Stars · 9 Forks

[ACMMM 2020] Code release for "Simultaneous Semantic Alignment Network for Heterogeneous Domain Adaptation" https://arxiv.org/abs/2008.01677

easy-bert

68 Stars · 12 Forks

easy-bert is a Chinese NLP toolkit that provides ready-made interfaces and tuning methods for many BERT variants, enabling a very fast start; its clean design and code comments also make it well suited for learning.

dynamic-cdfsl

28 Stars · 5 Forks

PyTorch code for the NeurIPS 2021 paper "Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data".

DICOD

29 Stars · 3 Forks

Official PyTorch implementation of Distilling Image Classifiers in Object Detection (NeurIPS 2021).

DIODE

60 Stars · 6 Forks

Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021.

Knowledge-Distillation-in-Keras

49 Stars · 17 Forks

Demonstrates knowledge distillation for image-based models in Keras.
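
As a companion to that repository's theme (and not its actual code), here is a minimal sketch of a Keras distillation training loop: a custom `train_step` wraps hypothetical `teacher` and `student` models and combines the soft-target loss shown earlier with a hard-label loss. All class, argument, and hyperparameter names are assumptions.

```python
import tensorflow as tf

class Distiller(tf.keras.Model):
    """Illustrative teacher/student wrapper; not the listed repo's implementation."""

    def __init__(self, teacher, student, temperature=4.0, alpha=0.1):
        super().__init__()
        self.teacher = teacher        # frozen, pretrained model (outputs logits)
        self.student = student        # smaller model being trained (outputs logits)
        self.temperature = temperature
        self.alpha = alpha            # weight on the hard-label loss

    def compile(self, optimizer, student_loss_fn):
        super().compile(optimizer=optimizer)
        self.student_loss_fn = student_loss_fn
        self.kl = tf.keras.losses.KLDivergence()

    def train_step(self, data):
        x, y = data
        teacher_logits = self.teacher(x, training=False)
        with tf.GradientTape() as tape:
            student_logits = self.student(x, training=True)
            # Hard-label loss against the ground truth.
            hard_loss = self.student_loss_fn(y, student_logits)
            # Soft-target loss against the teacher's softened distribution.
            soft_loss = self.kl(
                tf.nn.softmax(teacher_logits / self.temperature),
                tf.nn.softmax(student_logits / self.temperature),
            ) * self.temperature ** 2
            loss = self.alpha * hard_loss + (1.0 - self.alpha) * soft_loss
        grads = tape.gradient(loss, self.student.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.student.trainable_variables))
        return {"loss": loss}
```

A `Distiller` built this way would be compiled with an optimizer and a hard-label loss (e.g. sparse categorical cross-entropy with `from_logits=True`) and then trained with the usual `model.fit` call.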

DCM

31 Stars · 4 Forks

Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV 2020)