network-compression topic
distiller
Neural Network Distiller by Intel AI Lab: a Python package for neural network compression research. https://intellabs.github.io/distiller
awesome-quantization-and-fixed-point-training
Neural Network Quantization & Low-Bit Fixed Point Training For Hardware-Friendly Algorithm Design
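Since several entries in this list revolve around quantization, here is a minimal sketch of the uniform quantize-dequantize step ("fake quantization") that low-bit fixed-point training simulates in the forward pass; this is generic PyTorch, not code from the repository above, and the bit-width default is illustrative.

    # Assumption: generic uniform affine quantize-dequantize, not code from
    # the repository above; num_bits and the clamp epsilon are illustrative.
    import torch

    def fake_quantize(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
        """Quantize x to num_bits integers, then dequantize (simulated quantization)."""
        qmin, qmax = 0, 2 ** num_bits - 1
        x_min, x_max = x.min(), x.max()
        scale = (x_max - x_min).clamp(min=1e-8) / (qmax - qmin)
        zero_point = qmin - torch.round(x_min / scale)
        q = torch.clamp(torch.round(x / scale + zero_point), qmin, qmax)
        return (q - zero_point) * scale

    x = torch.randn(4, 4)
    print(fake_quantize(x, num_bits=4))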
overhaul-distillation
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
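The distillation repositories in this list (this one, AB_distillation, and BSS_distillation below) all build on the classic logit-matching loss of Hinton et al.; a minimal sketch of that baseline follows, with illustrative defaults for the temperature and mixing weight.

    # Assumption: the standard Hinton et al. distillation loss; T and alpha
    # are common illustrative defaults, not values from any repo listed here.
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
        # Soft targets: KL divergence between temperature-softened distributions,
        # scaled by T^2 to keep gradient magnitudes comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard targets: ordinary cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard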
aimet
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
permute-quantize-finetune
Using ideas from product quantization for state-of-the-art neural network compression.
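For context, a bare-bones sketch of the product-quantization idea: sub-vectors of a weight matrix are clustered with k-means and each is replaced by its nearest centroid, so only the codebook and one code per sub-vector need to be stored. The permutation search and finetuning that give permute-quantize-finetune its results are omitted, and all names here are illustrative.

    # Assumption: a bare-bones product quantizer over a dense weight matrix;
    # subvec_dim and n_centroids are illustrative.
    import numpy as np
    from sklearn.cluster import KMeans

    def pq_compress(W: np.ndarray, subvec_dim: int = 4, n_centroids: int = 256):
        """Split W into sub-vectors, cluster them, and rebuild W from centroids."""
        out_f, in_f = W.shape
        assert in_f % subvec_dim == 0
        blocks = W.reshape(-1, subvec_dim)                 # one row per sub-vector
        km = KMeans(n_clusters=n_centroids, n_init=4).fit(blocks)
        codes = km.predict(blocks)                         # 1 byte/sub-vector if <=256 centroids
        W_hat = km.cluster_centers_[codes].reshape(out_f, in_f)
        return codes, km.cluster_centers_, W_hat

    W = np.random.randn(128, 64).astype(np.float32)
    codes, codebook, W_hat = pq_compress(W)
    print("relative reconstruction error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))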
FisherPruning
Group Fisher Pruning for Practical Network Compression (ICML 2021)
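A minimal sketch of the Fisher-style importance score behind such pruning methods: the loss increase from removing a channel is approximated by the squared activation-gradient product, accumulated over data. The coupled-layer grouping that is the paper's main contribution is omitted, and all names are illustrative.

    # Assumption: a single-layer Fisher importance estimate via an activation
    # hook; coupled-layer grouping from the paper is not reproduced here.
    import torch
    import torch.nn as nn

    def fisher_channel_scores(conv: nn.Conv2d, model, loss_fn, data_loader):
        acts = {}

        def save_activation(module, inputs, output):
            output.retain_grad()        # keep d(loss)/d(activation) after backward
            acts["a"] = output

        handle = conv.register_forward_hook(save_activation)
        scores = torch.zeros(conv.out_channels)
        for x, y in data_loader:
            model.zero_grad()
            loss_fn(model(x), y).backward()
            a, g = acts["a"], acts["a"].grad               # both (N, C, H, W)
            # Fisher score per channel: (sum over H,W of a*g)^2, accumulated over data.
            scores += (a * g).sum(dim=(2, 3)).pow(2).sum(dim=0).detach()
        handle.remove()
        return scores                                      # low score => cheap to prune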
AB_distillation
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
model_optimization
Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers advanced quantization and compression tools for deploying state-of-the-art neural networks.
musco-pytorch
MUSCO: MUlti-Stage COmpression of neural networks
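A minimal sketch of a single low-rank compression step, here truncated SVD on a fully connected layer; multi-stage methods like MUSCO apply such factorizations iteratively and finetune between stages. MUSCO itself targets convolutional layers with tensor decompositions, so this is a simplified stand-in with an illustrative fixed rank.

    # Assumption: a simplified stand-in using truncated SVD on a linear layer,
    # not MUSCO's own conv-layer decompositions; the rank is illustrative.
    import torch
    import torch.nn as nn

    def factorize_linear(layer: nn.Linear, rank: int) -> nn.Sequential:
        """Replace one Linear layer by two thinner ones via truncated SVD."""
        U, S, Vh = torch.linalg.svd(layer.weight.data, full_matrices=False)
        first = nn.Linear(layer.in_features, rank, bias=False)
        second = nn.Linear(rank, layer.out_features, bias=layer.bias is not None)
        first.weight.data = Vh[:rank, :]                    # (rank, in_features)
        second.weight.data = U[:, :rank] * S[:rank]         # (out_features, rank)
        if layer.bias is not None:
            second.bias.data = layer.bias.data.clone()
        return nn.Sequential(first, second)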
BSS_distillation
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)