Seunghyun Lee
KD_methods_with_TF
Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added).
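As background for the distillation repositories above, here is a minimal sketch of the classic soft-target knowledge distillation loss (Hinton et al., 2015) in plain Python; the function names and the temperature value are illustrative, not taken from the repositories themselves.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Cross-entropy between the teacher's and student's softened
    distributions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures (per Hinton et al.)."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    return -temperature ** 2 * sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

A student whose logits match the teacher's minimizes this loss; in practice it is combined with the ordinary cross-entropy on hard labels.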
Knowledge_distillation_via_TF2.0
Implementations of recent knowledge distillation algorithms, with benchmark results, using the TF2.0 low-level API.
GALA_TF2.0
TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019).
Lightweighting_Cookbook
A cookbook for neural network training and lightweighting, covering three lightweighting solutions: knowledge distillation, filter pruning, and quantization.
TF2-jit-compile-on-multi-gpu
TensorFlow 2 training code with JIT compilation on multiple GPUs.
Variational_Information_Distillation
Reproduction of Variational Information Distillation (VID, CVPR 2019) (work in progress).
Zero-shot_Knowledge_Distillation
Implementation of "Zero-Shot Knowledge Distillation in Deep Networks" (ICML 2019).
EKG
Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning