mmsegmentation-distiller

This is a knowledge distillation toolbox based on mmsegmentation.

14 mmsegmentation-distiller issues

What hyperparameters should be used when distilling with CWD on object detection? I used t=1, α=5 from the config file in the detection repo, but the training result is only around 0.0x.

Is this project still being maintained? I tried to join the WeChat group chat via the QR code but never got through.

# Patching CVE-2007-4559 Hi, we are security researchers from the Advanced Research Center at [Trellix](https://www.trellix.com). We have begun a campaign to patch a widespread bug named CVE-2007-4559. CVE-2007-4559 is a...

Hi, a question for the author: in the paper you use the KL divergence to learn from the teacher's output, Equation (4) ![2021-11-26 12-03-51屏幕截图](https://user-images.githubusercontent.com/26001890/143525116-0886cdf7-60f2-49ba-b1c9-a3eaf7af0600.png) In your code you compute the KD loss as ` loss = torch.sum(-softmax_pred_T * logsoftmax(preds_S.view(-1, W * H) / self.tau)) * (self.tau**2) `, but the standard KL loss should be ` kl_loss = torch.sum(softmax_pred_T...
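For reference, a minimal sketch (assuming hypothetical student/teacher logits and a placeholder temperature `tau`, not the repository's code) showing why the two forms give the same training signal: the cross-entropy form in the snippet above and the full KL divergence differ only by the teacher's entropy, which does not depend on the student.

```python
import torch
import torch.nn.functional as F

tau, W, H = 4.0, 8, 8                                   # placeholder temperature and spatial size
preds_S = torch.randn(3, W * H, requires_grad=True)     # hypothetical student logits (C, W*H)
preds_T = torch.randn(3, W * H)                         # hypothetical teacher logits (C, W*H)

softmax_pred_T = F.softmax(preds_T / tau, dim=1)
log_pred_S = F.log_softmax(preds_S / tau, dim=1)

# Cross-entropy form, as in the snippet above: -sum(p_T * log p_S)
ce_loss = torch.sum(-softmax_pred_T * log_pred_S) * (tau ** 2)

# Full KL divergence: sum(p_T * (log p_T - log p_S))
kl_loss = torch.sum(softmax_pred_T * (softmax_pred_T.log() - log_pred_S)) * (tau ** 2)

# The extra teacher-entropy term in the KL does not depend on the student,
# so both losses produce identical gradients for the student logits.
g_ce, = torch.autograd.grad(ce_loss, preds_S, retain_graph=True)
g_kl, = torch.autograd.grad(kl_loss, preds_S)
print(torch.allclose(g_ce, g_kl, atol=1e-6))            # True
```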

Hello! We are currently studying semantic segmentation and knowledge distillation. Could you update the QR code of the discussion group?

Thanks for your attention! If you want to distill models in OpenMMLab-related repos, you can join the WeChat group in README.md.

When running distillation training with the code in train, the log shows the loss drops a little at the beginning, but then keeps oscillating around 44.x and does not seem to converge. Is something wrong with my settings?

I have tested the pretrained teacher model PSPNet + Res101 and got 73+ val mIoU, which is not consistent with the original paper. Could you help me? Thanks!

Hi, I would like to ask how to support PyTorch 1.9 with this method.

When I use a different network, how do I distill my own network structure? Can you supply some basic steps to do this? Also, I see the build_distiller function needs teacher_cfg...
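As a rough illustration of the general teacher-student pattern (a generic sketch only, not this repository's build_distiller / teacher_cfg API; the modules, temperature, and loss weight below are placeholders): the teacher is loaded frozen, the student is trained, and a distillation term is added to the ordinary segmentation loss.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleDistiller(nn.Module):
    """Generic sketch: frozen teacher + trainable student with a soft-label KD term."""

    def __init__(self, teacher: nn.Module, student: nn.Module, tau: float = 1.0, alpha: float = 1.0):
        super().__init__()
        self.teacher = teacher.eval()              # teacher stays frozen
        for p in self.teacher.parameters():
            p.requires_grad_(False)
        self.student = student                     # only the student is optimized
        self.tau = tau                             # distillation temperature (placeholder value)
        self.alpha = alpha                         # weight of the distillation term (placeholder value)

    def forward(self, imgs, labels):
        with torch.no_grad():
            logits_t = self.teacher(imgs)          # [N, C, H, W] teacher logits
        logits_s = self.student(imgs)              # [N, C, H, W] student logits

        # Ordinary segmentation loss on the ground-truth labels.
        task_loss = F.cross_entropy(logits_s, labels, ignore_index=255)

        # Soft-label distillation over the spatial locations of each channel.
        n, c, h, w = logits_s.shape
        p_t = F.softmax(logits_t.view(n, c, -1) / self.tau, dim=-1)
        log_p_s = F.log_softmax(logits_s.view(n, c, -1) / self.tau, dim=-1)
        distill_loss = torch.sum(-p_t * log_p_s) * (self.tau ** 2) / (n * c)

        return task_loss + self.alpha * distill_loss
```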