mmsegmentation-distiller
ChannelWiseDivergence unused variable
In mmseg.distillation.losses.cwd.ChannelWiseDivergence, the student softmax output is computed but never used (the same is true of self.name):
softmax_pred_S = F.softmax(preds_S.view(-1, W * H) / self.tau, dim=1)
Is this intended, or is it a bug?
Thanks for your attention!
The arg is deprecated.
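For context, here is a minimal sketch of the channel-wise divergence computation, assuming it mirrors the logic in cwd.py (the function name and shapes are illustrative, not the repo's exact code). The KL term needs only the teacher's softmax plus the log-softmax of both teacher and student, which is why the softmax_pred_S line is dead code:

```python
import torch
import torch.nn.functional as F

def channel_wise_divergence(preds_S, preds_T, tau=1.0, loss_weight=1.0):
    """Sketch of channel-wise KD: treat each channel's H*W activation
    map as a distribution and minimize KL(softmax(T/tau) || softmax(S/tau))."""
    N, C, H, W = preds_S.shape
    # Flatten each channel into a (N*C, H*W) row of spatial logits.
    logits_S = preds_S.view(-1, H * W) / tau
    logits_T = preds_T.view(-1, H * W) / tau

    softmax_T = F.softmax(logits_T, dim=1)
    # KL(T || S) = sum_x p_T(x) * (log p_T(x) - log p_S(x)).
    # Only the teacher's softmax appears; the student enters through
    # log_softmax, so a separate softmax of the student is never needed.
    loss = torch.sum(
        softmax_T * (F.log_softmax(logits_T, dim=1) -
                     F.log_softmax(logits_S, dim=1))
    ) * (tau ** 2)
    return loss_weight * loss / (C * N)

# Example usage with random feature maps (2 images, 19 classes, 32x32).
s = torch.randn(2, 19, 32, 32)
t = torch.randn(2, 19, 32, 32)
print(channel_wise_divergence(s, t, tau=4.0))
```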
If you want to distill models in OpenMMLab-related repos, you can join the WeChat group linked in README.md.