model-compression topic
Data-Free-Adversarial-Distillation
Code and pretrained models for the paper "Data-Free Adversarial Distillation".
CompRess
Compressing Representations for Self-Supervised Learning
Keras_model_compression
Model compression based on Geoffrey Hinton's logit regression method, implemented in Keras and applied to MNIST: 16x compression while keeping accuracy above 0.95. An implementation of "Distilling the Knowledge in a Neural Network".
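A minimal sketch of the logit-based distillation idea behind this repo (Hinton-style soft targets in Keras); this is not the repository's code, and the temperature, loss weighting, and layer sizes are assumptions:

```python
# Minimal sketch of Hinton-style knowledge distillation in Keras (not the
# repo's actual code). Temperature T, weight ALPHA, and layer sizes are assumed.
import tensorflow as tf

T, ALPHA = 5.0, 0.1  # softmax temperature, weight on the hard-label term (assumed)

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

def mlp(hidden):
    return tf.keras.Sequential([
        tf.keras.Input(shape=(784,)),
        tf.keras.layers.Dense(hidden, activation="relu"),
        tf.keras.layers.Dense(10),  # raw logits
    ])

teacher = mlp(1200)  # large "teacher" (sizes are illustrative, not the repo's)
student = mlp(32)    # much smaller "student"

# 1) Train the teacher on hard labels.
teacher.compile(optimizer="adam",
                loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True))
teacher.fit(x_train, y_train, epochs=1, batch_size=128, verbose=0)

# 2) Distill: match softened teacher probabilities plus a small hard-label term.
soft_targets = tf.nn.softmax(teacher.predict(x_train, verbose=0) / T)

def distill_loss(y_true, y_pred_logits):
    # y_true packs [hard label, 10 softened teacher probabilities] per sample.
    hard = tf.keras.losses.sparse_categorical_crossentropy(
        tf.cast(y_true[:, 0], tf.int32), y_pred_logits, from_logits=True)
    soft = tf.keras.losses.categorical_crossentropy(
        y_true[:, 1:], tf.nn.softmax(y_pred_logits / T))
    return ALPHA * hard + (1.0 - ALPHA) * (T ** 2) * soft

y_combined = tf.concat([tf.cast(y_train[:, None], tf.float32), soft_targets], axis=1)
student.compile(optimizer="adam", loss=distill_loss)
student.fit(x_train, y_combined, epochs=1, batch_size=128, verbose=0)
```

The `T ** 2` factor keeps the gradient scale of the soft-target term comparable to the hard-label term, as recommended in the distillation paper.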
SViTE
[NeurIPS'21] "Chasing Sparsity in Vision Transformers: An End-to-End Exploration" by Tianlong Chen, Yu Cheng, Zhe Gan, Lu Yuan, Lei Zhang, Zhangyang Wang
ZAQ-code
CVPR 2021: Zero-shot Adversarial Quantization (ZAQ)
BiPointNet
The official implementation of our ICLR 2021 paper "BiPointNet: Binary Neural Network for Point Clouds".
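For context, a generic binary-weight layer with a straight-through estimator shows the basic mechanism of binary neural networks; this is a hedged sketch of the general technique, not BiPointNet's method or code:

```python
# A generic binary-weight layer with a straight-through estimator (an assumed
# illustration of binary networks in general, not BiPointNet's method).
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    @staticmethod
    def forward(ctx, w):
        ctx.save_for_backward(w)
        return torch.sign(w)  # forward pass uses {-1, +1} weights

    @staticmethod
    def backward(ctx, grad_out):
        (w,) = ctx.saved_tensors
        # Straight-through estimator: pass gradients where |w| <= 1, clip elsewhere.
        return grad_out * (w.abs() <= 1).float()

class BinaryLinear(nn.Linear):
    def forward(self, x):
        # Latent real-valued weights are kept; only the forward pass is binarized.
        return F.linear(x, BinarizeSTE.apply(self.weight), self.bias)

layer = BinaryLinear(64, 128)
out = layer(torch.randn(8, 64))
print(out.shape)  # torch.Size([8, 128])
```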
ASSL
[NeurIPS'21 Spotlight] Aligned Structured Sparsity Learning for Efficient Image Super-Resolution (PyTorch)
Awesome-Pruning-at-Initialization
[IJCAI'22 Survey] Recent Advances on Neural Network Pruning at Initialization.
Smile-Pruning
A generic code base for neural network pruning, especially for pruning at initialization.
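As an illustration of pruning at initialization, below is a small SNIP-style sketch (score each weight by |w * dL/dw| on one mini-batch, keep the global top fraction, and fix the mask before training); it is an assumed example, not code from Smile-Pruning, and the model, data, and keep ratio are placeholders:

```python
# SNIP-style pruning at initialization (assumed example, not Smile-Pruning code).
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
keep_ratio = 0.1  # keep 10% of weights (placeholder value)

# One mini-batch of (random, illustrative) data to estimate saliencies.
x, y = torch.randn(64, 784), torch.randint(0, 10, (64,))
F.cross_entropy(model(x), y).backward()

# Global threshold on |w * grad| over all weight matrices.
weights = [m.weight for m in model if isinstance(m, nn.Linear)]
scores = torch.cat([(w * w.grad).abs().flatten() for w in weights])
k = int(keep_ratio * scores.numel())
threshold = scores.topk(k).values.min()

# Apply and freeze the masks: pruned weights stay zero during training.
masks = []
for w in weights:
    mask = ((w * w.grad).abs() >= threshold).float()
    with torch.no_grad():
        w.mul_(mask)
    w.register_hook(lambda grad, mask=mask: grad * mask)
    masks.append(mask)

print(f"kept {int(sum(m.sum() for m in masks))} of {scores.numel()} weights")
```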
Structured-Bayesian-Pruning-pytorch
PyTorch implementation of Structured Bayesian Pruning.