Online-Knowledge-Distillation-via-Collaborative-Learning

Possible mistake: the original paper's peer models share the same low-level features

Open wslgqq277g opened this issue 2 years ago • 2 comments

Possible mistake: the original paper's peer models share the same low-level features. According to your code, it seems you simply ensemble different models, built with different architectures, each with its own independent low-level features.

wslgqq277g avatar May 02 '22 12:05 wslgqq277g

(Auto-reply) I have received your email and will reply as soon as possible.

shaoeric avatar May 02 '22 12:05 shaoeric

Yes, the KDCL paper constructs parallel models without shared low-level features, the same as Deep Mutual Learning. In general, this achieves better performance and is easier to implement.
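To illustrate, here is a minimal PyTorch sketch (not the actual code from this repo) of that design: each peer is a fully independent network with its own low-level features, and the averaged ensemble logits serve as the soft target that every peer distills from. The peer architecture, the averaging of logits, and the temperature `T` are illustrative assumptions, not taken from the paper or this repo.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_peer(in_dim=32, hidden=64, num_classes=10):
    # Each peer is a separate network: no shared low-level feature extractor.
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                         nn.Linear(hidden, num_classes))

def kdcl_step(peers, x, y, T=3.0):
    logits = [p(x) for p in peers]                   # independent forward passes
    ensemble = torch.stack(logits).mean(0).detach()  # soft target from the ensemble
    soft = F.softmax(ensemble / T, dim=1)
    loss = 0.0
    for z in logits:
        ce = F.cross_entropy(z, y)                   # hard-label loss per peer
        kd = F.kl_div(F.log_softmax(z / T, dim=1), soft,
                      reduction="batchmean") * T * T # distill from the ensemble
        loss = loss + ce + kd
    return loss / len(peers)

peers = [make_peer(), make_peer()]
x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss = kdcl_step(peers, x, y)
loss.backward()  # gradients flow into every peer independently
```

Since the peers share no parameters, `loss.backward()` updates each peer's weights only through its own logits, which is what makes this "collaborative" rather than a shared-backbone multi-branch design.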

shaoeric avatar May 02 '22 12:05 shaoeric