seq2seq-attn
code for seq-level distillation
Hi Yoon,
As mentioned in the Sequence-Level Knowledge Distillation paper, the implementation of the distillation model is released in this repo, but I couldn't find the corresponding code (for either word-level or sequence-level distillation). Is it still under construction, or is there something I've missed? Please advise.
Regards,
I couldn't find it either.
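For anyone else landing here: word-level distillation in the paper trains the student to match the teacher's per-token output distribution (cross-entropy against the teacher's soft targets), while sequence-level distillation trains the student on the teacher's beam-search outputs as hard targets. Below is a minimal NumPy sketch of the word-level loss only; the function names and shapes are my own illustration, not code from this repo.

```python
import numpy as np

def softmax(logits, axis=-1):
    # numerically stable softmax over the vocabulary axis
    z = logits - logits.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def word_level_kd_loss(student_logits, teacher_logits, temperature=1.0):
    """Word-level distillation loss: cross-entropy of the student
    against the teacher's soft per-token distribution, averaged
    over time steps. Shapes: (seq_len, vocab_size).
    Illustrative sketch only, not the repo's implementation."""
    teacher_probs = softmax(teacher_logits / temperature)
    student_logp = np.log(softmax(student_logits / temperature))
    return float(-(teacher_probs * student_logp).sum(axis=-1).mean())

# toy example: 3 decoder time steps, vocabulary of 5
rng = np.random.default_rng(0)
teacher = rng.normal(size=(3, 5))
loss_match = word_level_kd_loss(teacher.copy(), teacher)  # student == teacher
loss_off = word_level_kd_loss(rng.normal(size=(3, 5)), teacher)
```

When the student's logits equal the teacher's, the loss reduces to the teacher's entropy, which is the minimum; any other student distribution gives a strictly larger value.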