
Paper: Data-Distortion Guided Self-Distillation for Deep Neural Networks

Open yiqings opened this issue 2 years ago • 2 comments

  • Paper: Data-Distortion Guided Self-Distillation for Deep Neural Networks
  • Paper Link: https://ojs.aaai.org/index.php/AAAI/article/download/4498/4376

Description

1. A self-distillation scheme built on distilling different augmented/distorted views of the same image through the same student (sketched below).
2. An MMD loss that distills the features produced from the different augmented/distorted views.
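
For reference, here is a minimal PyTorch sketch of the training step described above. This is not KD_Lib's API or the paper's official code; `student`, `distort_a`, `distort_b`, the temperature, and the loss weighting are placeholder assumptions.

```python
import torch.nn.functional as F

def ddgsd_step(student, x, y, distort_a, distort_b, temperature=3.0, kl_weight=1.0):
    """One training step: the same student sees two distorted views of the
    same batch and is pushed to make consistent predictions on both."""
    # distort_a / distort_b are placeholder augmentation callables
    logits_a = student(distort_a(x))  # first distorted view
    logits_b = student(distort_b(x))  # second distorted view

    # Supervised cross-entropy on both views
    ce = F.cross_entropy(logits_a, y) + F.cross_entropy(logits_b, y)

    # Symmetric KL divergence between the two softened output distributions
    log_p_a = F.log_softmax(logits_a / temperature, dim=1)
    log_p_b = F.log_softmax(logits_b / temperature, dim=1)
    kl = F.kl_div(log_p_a, log_p_b, reduction="batchmean", log_target=True) \
       + F.kl_div(log_p_b, log_p_a, reduction="batchmean", log_target=True)

    return ce + kl_weight * (temperature ** 2) * kl
```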

Modifications

Removing the MMD loss and retaining only the KL loss is probably fine, since the KL loss alone already demonstrates competitive performance.

In my local experiments on CIFAR-10/100, the method proves to be a very powerful self-distillation scheme, even without the MMD loss.

It also shows strong compatibility with other distillation schemes and can serve as a complementary component.
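
For completeness, a sketch of the kind of feature-level MMD term being discussed (a standard Gaussian-kernel MMD between the two views' feature batches; the function name and bandwidth are illustrative assumptions, not values from the paper or KD_Lib):

```python
import torch

def mmd_loss(feat_a, feat_b, sigma=1.0):
    """Gaussian-kernel MMD between two batches of features of shape [N, D];
    sigma is an illustrative bandwidth choice."""
    def rbf(x, y):
        d2 = torch.cdist(x, y) ** 2  # pairwise squared distances
        return torch.exp(-d2 / (2 * sigma ** 2))

    return rbf(feat_a, feat_a).mean() + rbf(feat_b, feat_b).mean() \
        - 2 * rbf(feat_a, feat_b).mean()
```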

yiqings avatar Mar 29 '22 23:03 yiqings

https://github.com/youngerous/ddgsd-pytorch provides an unofficial implementation.

yiqings avatar Mar 29 '22 23:03 yiqings

Hi @yiqings, thanks for raising this issue. Unfortunately, development for KD-Lib has stalled for now, but we will be sure to keep this issue in mind when / if we resume. Also, do let me know if you would be interested in contributing an implementation for this paper.

NeelayS avatar Mar 30 '22 05:03 NeelayS