Online-Knowledge-Distillation-via-Collaborative-Learning
Some mistake: the original paper's peer models share the same low-level features
According to your code, it seems like you simply ensemble different models built with different architectures, each with its own independent low-level features.
Yes, the KDCL paper constructs parallel models without shared low-level features, the same as Deep Mutual Learning. In general, this achieves better performance and is easier to implement.
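
For reference, here is a minimal sketch of the setup described in this thread: the peer models are fully independent networks (no shared low-level feature extractor), and their logits are ensembled to form a soft target for every peer, in the spirit of KDCL / Deep Mutual Learning. This assumes PyTorch/torchvision; the peer choices (resnet18/resnet50), temperature, and simple logit averaging are illustrative, not the exact recipe from the repo.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import resnet18, resnet50

# Two peers with completely independent parameters -- no shared stem or low-level layers.
peers = nn.ModuleList([resnet18(num_classes=10), resnet50(num_classes=10)])

def kd_step(images, labels, T=3.0):
    """One collaborative-learning step: each peer is supervised by the hard labels
    and by a soft target ensembled from all peers' logits."""
    logits = [peer(images) for peer in peers]

    # Simple average of the peers' logits as the soft target.
    # (KDCL proposes several ensembling strategies; averaging is the simplest.)
    soft_target = F.softmax(torch.stack(logits).mean(dim=0) / T, dim=1).detach()

    loss = 0.0
    for out in logits:
        ce = F.cross_entropy(out, labels)                       # hard-label loss
        kd = F.kl_div(F.log_softmax(out / T, dim=1),            # distillation loss
                      soft_target, reduction="batchmean") * T * T
        loss = loss + ce + kd
    return loss

# Example usage with random data:
if __name__ == "__main__":
    x = torch.randn(4, 3, 224, 224)
    y = torch.randint(0, 10, (4,))
    optimizer = torch.optim.SGD(peers.parameters(), lr=0.01)
    optimizer.zero_grad()
    kd_step(x, y).backward()
    optimizer.step()
```

Because the peers share no parameters, each one can use a different architecture, which is why the code simply instantiates separate models and ensembles their outputs rather than branching off a common low-level feature extractor.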