knowledge-distillation topic

List knowledge-distillation repositories

Teacher-free-Knowledge-Distillation
569 stars · 68 forks

Knowledge Distillation: CVPR 2020 Oral, "Revisiting Knowledge Distillation via Label Smoothing Regularization"
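For context on what these repositories implement: the classic soft-target distillation loss (Hinton et al.) blends hard-label cross-entropy with a KL term between temperature-softened teacher and student distributions. The sketch below is a generic illustration, not code taken from any repository listed here; the function name and hyperparameter defaults are illustrative.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Soft-target knowledge distillation: weighted sum of hard-label
    cross-entropy and KL divergence between temperature-softened
    teacher and student distributions (generic sketch)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescales gradients to match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

The Teacher-free variant above replaces the teacher distribution with a manually designed (label-smoothing-like) target, but the loss structure is the same.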

torchdistill
1.3k stars · 124 forks

A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Traine...

SUOD
373 stars · 48 forks

(MLSys '21) An Acceleration System for Large-scale Unsupervised Heterogeneous Outlier Detection (Anomaly Detection)

mammoth
476 stars · 83 forks

An extensible (general) continual learning framework built on PyTorch; the official codebase of "Dark Experience for General Continual Learning"

Collaborative-Distillation
184 stars · 23 forks

[CVPR '20] Collaborative Distillation for Ultra-Resolution Universal Style Transfer (PyTorch)

matchmaker
256 stars · 30 forks

Training & evaluation library for text-based neural re-ranking and dense retrieval models, built with PyTorch

microexpnet
139 stars · 26 forks

MicroExpNet: An Extremely Small and Fast Model for Expression Recognition from Frontal Face Images

RKD
382 stars · 49 forks

Official PyTorch implementation of Relational Knowledge Distillation, CVPR 2019
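Relational KD transfers the *structure* among examples rather than individual outputs: the student is trained to match the teacher's normalized pairwise distances between embeddings. The sketch below is a rough rendering of the paper's distance-wise loss, not the official code; see the repository above for the reference implementation.

```python
import torch
import torch.nn.functional as F

def rkd_distance_loss(student_emb, teacher_emb):
    """Distance-wise relational KD (sketch): penalize mismatch
    between mean-normalized pairwise distance matrices of the
    student and teacher embedding batches."""
    def pdist_normalized(e):
        d = torch.cdist(e, e, p=2)  # pairwise Euclidean distances
        mask = ~torch.eye(len(e), dtype=torch.bool)  # drop the zero diagonal
        return d / (d[mask].mean() + 1e-8)
    return F.smooth_l1_loss(pdist_normalized(student_emb),
                            pdist_normalized(teacher_emb))
```

Because only relations are matched, student and teacher embeddings may have different dimensionalities.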

EasyTransfer
852 stars · 161 forks

EasyTransfer is designed to make the development of transfer learning in NLP applications easier.

neural-compressor
2.2k stars · 254 forks

SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
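To make the INT8 quantization mentioned in that description concrete: the core idea is to map float weights onto 8-bit integers with a stored scale factor, trading a bounded rounding error for a 4x smaller representation. The sketch below shows symmetric per-tensor quantization in plain PyTorch; it is a didactic illustration under simplified assumptions, not neural-compressor's API.

```python
import torch

def quantize_int8(w):
    """Symmetric per-tensor INT8 quantization (sketch): map real
    weights into [-127, 127] via a single scale, round, and keep
    the scale for dequantization."""
    scale = w.abs().max() / 127.0
    q = torch.clamp(torch.round(w / scale), -127, 127).to(torch.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from INT8 values."""
    return q.to(torch.float32) * scale
```

Production systems refine this with per-channel scales, asymmetric zero points, and calibration data, but the round-trip error of this scheme is already bounded by half the scale per weight.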