mmsegmentation-distiller
This is a knowledge distillation toolbox based on mmsegmentation.
This repo is deprecated!
If you want to distill models in OpenMMLab-related repos, please use MMRazor instead.
If you are interested in knowledge distillation, you can also contact me on WeChat and I will invite you to the KD group.
This project is based on MMSegmentation (v0.11.0); all usage, including training and testing, is the same as in mmsegmentation.
Distiller Zoo
- [x] Channel-wise Distillation for Semantic Segmentation
- [ ] Structured Knowledge Distillation for Semantic Segmentation
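For intuition, channel-wise distillation (CWD) treats each channel's spatial activation map as a probability distribution via a temperature-scaled softmax, then minimizes the KL divergence between the teacher's and student's per-channel distributions. A minimal NumPy sketch of that loss, not the repo's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def cwd_loss(student, teacher, tau=4.0):
    """Channel-wise distillation loss (sketch).

    student, teacher: score maps of shape (N, C, H, W).
    Each channel's H*W map is normalized into a distribution with a
    temperature-scaled softmax; the loss is the mean KL divergence
    from teacher to student, scaled by tau**2.
    """
    n, c, _, _ = student.shape
    s = softmax(student.reshape(n, c, -1) / tau)
    t = softmax(teacher.reshape(n, c, -1) / tau)
    kl = (t * (np.log(t + 1e-8) - np.log(s + 1e-8))).sum(axis=-1)
    return (tau ** 2) * kl.mean()
```

The temperature `tau` and the final weighting of this loss against the ordinary segmentation loss are hyperparameters set in the config.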
Installation

1. Set up a new conda environment:

   ```shell
   conda create -n distiller python=3.7
   ```

2. Install PyTorch 1.3+.

3. Install mmsegmentation-distiller:

   ```shell
   git clone https://github.com/pppppM/mmsegmentation-distiller.git
   cd mmsegmentation-distiller
   pip install -r requirements/build.txt
   pip install -v -e .
   ```
Train

```shell
# single GPU
python tools/train.py configs/distillers/cwd/cwd_psp_r101-d8_distill_psp_r18_d8_512_1024_80k_cityscapes.py

# multi GPU
bash tools/dist_train.sh configs/distillers/cwd/cwd_psp_r101-d8_distill_psp_r18_d8_512_1024_80k_cityscapes.py 8
```
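A distillation config typically extends a student config and adds a distiller section naming the teacher, its pretrained weights, and the distillation losses. The field names below are illustrative assumptions for orientation only, not the repo's verified schema; see the actual files under `configs/distillers/cwd/` for the real format:

```python
# Hypothetical sketch of a CWD distiller config; all field names and
# paths here are assumptions, not verified against this repo.
_base_ = ['../../pspnet/pspnet_r18-d8_512x1024_80k_cityscapes.py']  # student

distiller = dict(
    type='SegmentationDistiller',
    # teacher: PSPNet-R101 config plus its pretrained checkpoint
    teacher_cfg='configs/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes.py',
    teacher_pretrained='checkpoints/psp_r101_teacher.pth',
    distill_losses=[
        # channel-wise divergence on the decode-head score maps
        dict(type='ChannelWiseDivergence', tau=4.0, loss_weight=3.0),
    ],
)
```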
Test

```shell
# single GPU
python tools/test.py configs/distillers/cwd/cwd_psp_r101-d8_distill_psp_r18_d8_512_1024_80k_cityscapes.py $CHECKPOINT --eval mIoU

# multi GPU
bash tools/dist_test.sh configs/distillers/cwd/cwd_psp_r101-d8_distill_psp_r18_d8_512_1024_80k_cityscapes.py $CHECKPOINT 8 --eval mIoU
```
License
This project is released under the Apache 2.0 license.