mmsegmentation-distiller
This is a knowledge distillation toolbox based on mmsegmentation.
In `mmseg.distillation.losses.cwd.ChannelWiseDivergence`, the student softmax output is never used (and neither is `self.name`): `softmax_pred_S = F.softmax(preds_S.view(-1,W*H)/self.tau, dim=1)`. Is this intended, or is it a bug?
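For context, a minimal NumPy sketch of what a channel-wise divergence (CWD) loss typically computes: each channel's spatial map is softmax-normalized for both teacher and student, and a temperature-scaled KL divergence is taken between them. This is an illustration of the technique, not the repository's actual implementation; note that here the student softmax (via its log-softmax) is used, unlike the snippet quoted in the issue.

```python
import numpy as np

def channel_wise_divergence(preds_S, preds_T, tau=1.0):
    """Sketch of a channel-wise distillation loss.

    preds_S, preds_T: logits of shape (N, C, H, W).
    Each channel's H*W map is softmax-normalized, then KL(teacher || student)
    is averaged over samples and channels, scaled by tau**2 as is standard
    in temperature-based distillation.
    """
    N, C, H, W = preds_S.shape
    s = preds_S.reshape(N * C, H * W) / tau
    t = preds_T.reshape(N * C, H * W) / tau

    def log_softmax(x):
        # numerically stable log-softmax over the spatial dimension
        x = x - x.max(axis=1, keepdims=True)
        return x - np.log(np.exp(x).sum(axis=1, keepdims=True))

    log_p_s = log_softmax(s)  # student log-probabilities: used in the loss
    log_p_t = log_softmax(t)
    p_t = np.exp(log_p_t)
    kl = (p_t * (log_p_t - log_p_s)).sum(axis=1)
    return tau ** 2 * kl.mean()
```

With identical student and teacher logits the loss is zero, which is a quick sanity check that both softmaxes actually enter the computation.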
Hi, I happened to find that the results of PSPNet-r101 on Pascal VOC (mIoU=78.52, mAcc=79.57) are exactly the same as those of the official model provided by mmsegmentation. 1. Have you tested the pretrained model?...

Thank you for your excellent project! As I noticed, the experimental settings, such as image crop size, batch size, and training iterations, are not the same as in the distillation paper "channel wise...