Seunghyun Lee

10 repositories owned by Seunghyun Lee

KD_methods_with_TF

265 stars · 59 forks

Knowledge distillation methods implemented with TensorFlow (currently 11 (+1) methods; more will be added).
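
To illustrate the kind of method collected here, below is a minimal sketch of the classic soft-target distillation loss (Hinton et al.), written against the TF2 API; the temperature value is an assumed example, not taken from this repository.

```python
import tensorflow as tf

def soft_target_kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Minimal soft-target KD loss sketch: KL divergence between the
    temperature-softened teacher and student distributions.
    The temperature of 4.0 is an assumed example value."""
    t_prob = tf.nn.softmax(teacher_logits / temperature, axis=-1)
    s_log_prob = tf.nn.log_softmax(student_logits / temperature, axis=-1)
    # KL(teacher || student), scaled by T^2 so gradients stay comparable
    # to the hard-label loss (as in Hinton et al.).
    kl = tf.reduce_sum(t_prob * (tf.math.log(t_prob + 1e-8) - s_log_prob), axis=-1)
    return tf.reduce_mean(kl) * temperature ** 2
```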

Knowledge_distillation_via_TF2.0

105 stars · 30 forks

Code for recent knowledge distillation algorithms and benchmark results, implemented with the TF2.0 low-level API.
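
As a rough sketch of what a low-level (non-`model.fit`) TF2 distillation step can look like: a custom `tf.GradientTape` loop mixing hard-label cross-entropy with a soft-target term. The `student`/`teacher` models and the `alpha`/`temperature` values are placeholders, and this generic Hinton-style recipe is not necessarily what the repository benchmarks.

```python
import tensorflow as tf

@tf.function
def kd_train_step(student, teacher, optimizer, x, y, alpha=0.1, temperature=4.0):
    """One custom training step combining hard-label cross-entropy with a
    soft-target KD term; alpha and temperature are assumed example values."""
    teacher_logits = teacher(x, training=False)
    with tf.GradientTape() as tape:
        student_logits = student(x, training=True)
        ce = tf.reduce_mean(tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=y, logits=student_logits))
        kd = tf.keras.losses.KLDivergence()(
            tf.nn.softmax(teacher_logits / temperature),
            tf.nn.softmax(student_logits / temperature)) * temperature ** 2
        loss = alpha * ce + (1.0 - alpha) * kd
    grads = tape.gradient(loss, student.trainable_variables)
    optimizer.apply_gradients(zip(grads, student.trainable_variables))
    return loss
```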

ADNet-tensorflow

16 stars · 9 forks

GALA_TF2.0

43 stars · 6 forks

TensorFlow 2.0 implementation of "Symmetric Graph Convolutional Autoencoder for Unsupervised Graph Representation Learning" (ICCV 2019).

Lightweighting_Cookbook

23 stars · 2 forks

This project aims to build a neural network training and lightweighting cookbook covering three kinds of lightweighting solutions: knowledge distillation, filter pruning, and quantization.
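
As a hedged illustration of one of the three solutions (quantization), post-training quantization of a Keras model via the TFLite converter might look like this; the cookbook's actual recipes may differ.

```python
import tensorflow as tf

def quantize_keras_model(model):
    """Post-training (dynamic-range) quantization of a Keras model.
    Returns a quantized TFLite flatbuffer as bytes."""
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    return converter.convert()
```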

TF2-jit-compile-on-multi-gpu

17 stars · 2 forks

TensorFlow 2 training code with JIT compilation on multiple GPUs.
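
A minimal sketch of what such a setup can look like, assuming data-parallel training with `tf.distribute.MirroredStrategy` and XLA JIT compilation of the per-replica step; the model and hyperparameters below are placeholders, not the repository's code.

```python
import tensorflow as tf

strategy = tf.distribute.MirroredStrategy()  # one replica per visible GPU

with strategy.scope():
    # Placeholder model/optimizer; the repository's actual models will differ.
    model = tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(10),
    ])
    optimizer = tf.keras.optimizers.SGD(0.1)

@tf.function(jit_compile=True)  # compile the per-replica step with XLA
def replica_step(x, y):
    with tf.GradientTape() as tape:
        logits = model(x, training=True)
        per_example = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=logits)
        # Average over the global batch so gradients sum correctly across replicas.
        loss = tf.nn.compute_average_loss(per_example)
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

@tf.function
def distributed_step(x, y):
    per_replica_loss = strategy.run(replica_step, args=(x, y))
    return strategy.reduce(tf.distribute.ReduceOp.SUM, per_replica_loss, axis=None)
```

A typical loop would distribute a `tf.data.Dataset` with `strategy.experimental_distribute_dataset` and call `distributed_step` once per batch.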

Variational_Information_Distillation

20 stars · 0 forks

Reproduction of Variational Information Distillation (VID, CVPR 2019); work in progress.
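
For context, a hedged sketch of the VID objective as described in the paper (Ahn et al., CVPR 2019): teacher features are modeled with a Gaussian whose mean is regressed from the student feature and whose per-channel variance is learned. The regressor output and variance parameterization below are assumptions, not the repository's code.

```python
import tensorflow as tf

def vid_loss(teacher_feat, predicted_mean, raw_sigma):
    """Negative Gaussian log-likelihood of teacher features under a variational
    distribution q(t | s): the mean comes from a student-side regressor, the
    variance from a learned parameter (softplus keeps it positive).
    Additive constants are dropped."""
    var = tf.nn.softplus(raw_sigma) + 1e-6
    nll = 0.5 * tf.math.log(var) + tf.square(teacher_feat - predicted_mean) / (2.0 * var)
    return tf.reduce_mean(nll)
```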

Zero-shot_Knowledge_Distillation

49 stars · 9 forks

Implementation of "Zero-Shot Knowledge Distillation in Deep Networks" (ICML 2019).

EKG

17 stars · 1 fork

Ensemble Knowledge Guided Sub-network Search and Fine-tuning for Filter Pruning
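
EKG's own pruning criterion is ensemble-knowledge guided; for orientation only, here is a plain magnitude-based (L1-norm) filter ranking, a deliberately simpler stand-in rather than the paper's method.

```python
import tensorflow as tf

def rank_filters_by_l1(conv_layer):
    """Rank a Conv2D layer's output filters by the L1 norm of their kernels
    (generic magnitude pruning, not EKG's criterion). The layer must already
    be built so that `kernel` exists."""
    kernel = conv_layer.kernel                    # shape: (kh, kw, in_ch, out_ch)
    l1 = tf.reduce_sum(tf.abs(kernel), axis=[0, 1, 2])
    return tf.argsort(l1)                         # smallest-norm filters first
```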