DiffKD

Official implementation for the paper "Knowledge Diffusion for Distillation", NeurIPS 2023

10 DiffKD issues

Dear Hunto, I tried your method on my task. I found that while the original loss (student's output vs. ground truth) decreased, the other losses, such as the autoencoder loss and the diffusion loss...
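For context on the losses this issue mentions, here is a minimal sketch of the kind of multi-term objective involved, assuming a simplified PyTorch setup; the loss names, inputs, and weights are illustrative placeholders, not DiffKD's actual implementation:

```python
import torch.nn.functional as F

def total_loss(student_logits, targets, recon, teacher_feat,
               noise_pred, noise, denoised_student,
               w_ae=1.0, w_diff=1.0, w_kd=1.0):
    # "Original" task loss: student output vs. ground truth.
    task = F.cross_entropy(student_logits, targets)
    # Autoencoder loss: reconstruct the teacher feature from its latent code.
    ae = F.mse_loss(recon, teacher_feat)
    # Diffusion (DDIM-style) loss: predict the noise added to the feature.
    diff = F.mse_loss(noise_pred, noise)
    # KD loss: denoised student feature vs. teacher feature.
    kd = F.mse_loss(denoised_student, teacher_feat)
    return task + w_ae * ae + w_diff * diff + w_kd * kd
```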

Hi @hunto, thanks for your answers to my previous questions (https://github.com/hunto/DiffKD/issues/3). Your work is very meaningful and can bring new perspectives to knowledge distillation. This led me to try...

Thank you for your excellent work! In Table 6, the student's parameters and FLOPs do not change. Is this because the student features are put into the head without going...

Hello, I am a student studying knowledge distillation for semantic segmentation, and I want to study your method in implementation detail. Is it possible to upload your segmentation task code?

Hi @hunto, I have been following your work for a long time, and I am very excited that the code for the classification task has been made public, but...

I am very interested in your work. May I know when the code will be released?

Hello author, from reading the code, my understanding is that the KD loss is computed between the denoised student feature and the teacher feature. During backpropagation, does this KD loss also participate, together with the DDIM loss, in updating the diffusion module?
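For readers wondering what this question is about: below is a minimal PyTorch sketch of the gradient-flow behavior it asks about, assuming the denoised student feature is produced by the diffusion module without a detach; `diffusion_model` and the one-step denoising here are hypothetical stand-ins, not the repository's actual API:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical stand-in for the diffusion module (not DiffKD's actual class).
diffusion_model = nn.Linear(64, 64)

student_feat = torch.randn(8, 64)
teacher_feat = torch.randn(8, 64)
noise = torch.randn_like(student_feat)

# One denoising step: the module predicts the noise added to the feature.
noise_pred = diffusion_model(student_feat + noise)
denoised = student_feat + noise - noise_pred

# KD loss between the denoised student feature and the teacher feature.
kd_loss = F.mse_loss(denoised, teacher_feat)

# Backpropagating the KD loss alone already reaches the diffusion module:
kd_loss.backward()
print(diffusion_model.weight.grad.abs().sum() > 0)  # tensor(True)
```

If that gradient path is not wanted, the denoised feature would have to be detached before computing the KD loss; whether DiffKD does so is exactly what this issue asks.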

DiffKD for object detection is cool! Could you provide the code for object detection? Thank you very much for your work.

Hello, the result I reproduced on the CIFAR-100 dataset using WRN_40_2 (teacher) and WRN_40_1 (student) is a classification accuracy of only 72.79%, which differs significantly from the result of...

Hello, your work is impressive. I have a question: Is there an official training configuration file available to reproduce the experimental results on the CIFAR dataset?