knowledge-distillation topic
Teacher-free-Knowledge-Distillation
CVPR 2020 Oral: Revisiting Knowledge Distillation via Label Smoothing Regularization
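For context, a minimal sketch of the idea behind this line of work, under assumed hyperparameters (T, correct_prob, function names are illustrative, not this repo's API): standard KD matches the student's softened outputs to a teacher's, and the teacher-free variant swaps the real teacher for a hand-crafted distribution, which amounts to label smoothing. The paper also tempers the virtual teacher; that detail is omitted here.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_probs, T=4.0):
    # KL divergence between the temperature-softened student distribution
    # and the teacher distribution, scaled by T^2 as is conventional in KD.
    log_p = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p, teacher_probs, reduction="batchmean") * T * T

def virtual_teacher(labels, num_classes, correct_prob=0.9):
    # Hand-crafted "teacher": the correct class gets correct_prob and the
    # remaining mass is spread uniformly, i.e. label smoothing in disguise.
    smooth = (1.0 - correct_prob) / (num_classes - 1)
    probs = torch.full((labels.size(0), num_classes), smooth,
                       device=labels.device)
    probs.scatter_(1, labels.unsqueeze(1), correct_prob)
    return probs

# Usage: loss = F.cross_entropy(logits, y) + alpha * kd_loss(logits, virtual_teacher(y, K))
```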
torchdistill
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Traine...
SUOD
(MLSys '21) An Acceleration System for Large-scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)
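Not SUOD's own API; a toy scikit-learn sketch of the workload it accelerates: scoring data with a heterogeneous set of unsupervised outlier detectors and combining their scores into one ensemble ranking. The data and detector choices here are placeholders.

```python
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.neighbors import LocalOutlierFactor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))          # toy data, not a benchmark

# Two heterogeneous detectors; flip signs so higher score = more anomalous.
iso_scores = -IsolationForest(random_state=0).fit(X).score_samples(X)
lof_scores = -LocalOutlierFactor(n_neighbors=20).fit(X).negative_outlier_factor_

def zscore(s):
    return (s - s.mean()) / s.std()

combined = (zscore(iso_scores) + zscore(lof_scores)) / 2.0  # ensemble score
```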
mammoth
An Extensible (General) Continual Learning Framework based on PyTorch - official codebase of Dark Experience for General Continual Learning
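A rough sketch of the Dark Experience Replay objective this codebase implements: cross-entropy on the current batch plus an MSE term matching the network's logits on buffered samples to the logits recorded when those samples were first seen. The plain-list buffer and the alpha value are simplifications of mine (the method proper uses reservoir sampling).

```python
import random
import torch
import torch.nn.functional as F

def der_step(model, x, y, buffer, alpha=0.5, replay_batch=32):
    # Plain task loss on the current batch.
    loss = F.cross_entropy(model(x), y)
    # Dark Experience Replay term: keep current logits on buffered samples
    # close to the logits stored when those samples were first seen.
    if len(buffer) >= replay_batch:
        bx, blogits = zip(*random.sample(buffer, replay_batch))
        loss = loss + alpha * F.mse_loss(model(torch.stack(bx)),
                                         torch.stack(blogits))
    # Store current samples with detached logits for future replay.
    with torch.no_grad():
        for xi, li in zip(x, model(x)):
            buffer.append((xi, li))
    return loss
```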
Collaborative-Distillation
[CVPR'20] Collaborative Distillation for Ultra-Resolution Universal Style Transfer (PyTorch)
matchmaker
Training & evaluation library for text-based neural re-ranking and dense retrieval models built with PyTorch
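Not matchmaker's API; a toy illustration of the scoring step such dense retrieval models are trained for: rank documents by the dot product between a query embedding and precomputed document embeddings. The encoders are stubbed out with random tensors here.

```python
import torch

query_emb = torch.randn(1, 256)      # from some query encoder (placeholder)
doc_embs = torch.randn(1000, 256)    # precomputed document embeddings

scores = (query_emb @ doc_embs.T).squeeze(0)   # one similarity per document
top10 = scores.topk(10).indices                # indices of the best matches
```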
microexpnet
MicroExpNet: An Extremely Small and Fast Model For Expression Recognition From Frontal Face Images
RKD
Official PyTorch implementation of Relational Knowledge Distillation (CVPR 2019)
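A sketch of the paper's distance-wise RKD loss: instead of matching individual outputs, penalize differences between the mean-normalized pairwise distance matrices of teacher and student embeddings, using a Huber penalty. Helper names below are mine, not the repo's.

```python
import torch
import torch.nn.functional as F

def pairwise_dist(emb):
    d = torch.cdist(emb, emb, p=2)
    return d / d[d > 0].mean()       # normalize by the mean non-zero distance

def rkd_distance_loss(student_emb, teacher_emb):
    with torch.no_grad():
        t = pairwise_dist(teacher_emb)   # teacher relations are fixed targets
    s = pairwise_dist(student_emb)
    return F.smooth_l1_loss(s, t)        # Huber penalty on relation mismatch
```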
EasyTransfer
EasyTransfer is designed to make the development of transfer learning in NLP applications easier.
neural-compressor
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
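Not neural-compressor's own interface; a minimal PyTorch-only illustration of one technique in its scope, post-training dynamic INT8 quantization of Linear layers, on a throwaway model.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10)).eval()

# Convert Linear weights to INT8; activations are quantized on the fly at runtime.
quantized = torch.ao.quantization.quantize_dynamic(model, {nn.Linear},
                                                   dtype=torch.qint8)
out = quantized(torch.randn(1, 128))
```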