knowledge-distillation topic
2DPASS
2DPASS: 2D Priors Assisted Semantic Segmentation on LiDAR Point Clouds (ECCV 2022)
efficient-bert
This repository contains the code for the paper in Findings of EMNLP 2021: "EfficientBERT: Progressively Searching Multilayer Perceptron via Warm-up Knowledge Distillation".
MSDN
Official PyTorch Implementation of MSDN (CVPR'22)
SSAN
[ACMMM 2020] Code release for "Simultaneous Semantic Alignment Network for Heterogeneous Domain Adaptation" https://arxiv.org/abs/2008.01677
easy-bert
easy-bert is a Chinese NLP toolkit that provides many BERT variants along with utilities for calling and tuning them, making it fast to get started; its clean design and code comments also make it well suited for learning.
dynamic-cdfsl
PyTorch code for the NeurIPS 2021 paper "Dynamic Distillation Network for Cross-Domain Few-Shot Recognition with Unlabeled Data".
DICOD
Official PyTorch implementation of "Distilling Image Classifiers in Object Detection" (NeurIPS 2021)
DIODE
Official PyTorch implementation of "Data-free Knowledge Distillation for Object Detection" (WACV 2021).
Knowledge-Distillation-in-Keras
Demonstrates knowledge distillation for image-based models in Keras.
DCM
Knowledge Transfer via Dense Cross-layer Mutual-distillation (ECCV'2020)
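Most of the repositories above build on the same core idea: training a student network to match a teacher's temperature-softened output distribution. As a minimal, framework-agnostic sketch (in NumPy, not taken from any of the listed codebases), the classic Hinton-style distillation loss looks like this; the function names and the temperature value are illustrative assumptions:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL divergence between the teacher's and student's softened
    distributions, scaled by T^2 so gradients keep a comparable
    magnitude across temperatures (Hinton et al., 2015)."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student's softened predictions
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return (T ** 2) * kl.mean()
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels via a weighting coefficient; the repositories listed above each add their own twist (cross-layer, data-free, cross-domain, etc.) on top of this basic objective.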