unilm
Any plans to release sample code for MiniLM distillation?
Hi,
Thank you for releasing the MiniLM models distilled from pre-trained Transformer models. I wonder if you have any plans to release sample code for the MiniLM distillation implementation, in either TensorFlow or PyTorch? People who have other pre-trained Transformer models and want to distill their knowledge would benefit from such scripts. Thank you very much.
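While waiting for the official scripts, here is a minimal NumPy sketch of the MiniLM objective as described in the paper: KL divergence between the teacher's and student's last-layer self-attention distributions, plus KL divergence between their value relations (scaled dot-products among the value vectors). All function names and tensor shapes below are illustrative assumptions, not the authors' actual implementation, and the sketch assumes teacher and student use the same number of attention heads.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention_distributions(q, k):
    # scaled dot-product attention probabilities
    # q, k: (num_heads, seq_len, head_dim) -> (num_heads, seq_len, seq_len)
    d = q.shape[-1]
    return softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d))

def value_relation(v):
    # MiniLM's value relation: softmax over scaled dot-products among values
    d = v.shape[-1]
    return softmax(v @ v.transpose(0, 2, 1) / np.sqrt(d))

def kl(p, q, eps=1e-12):
    # mean KL(p || q) over heads and query positions
    return np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1))

def minilm_loss(teacher_qkv, student_qkv):
    # teacher_qkv / student_qkv: (queries, keys, values) from the LAST
    # self-attention layer of each model
    tq, tk, tv = teacher_qkv
    sq, sk, sv = student_qkv
    l_at = kl(attention_distributions(tq, tk), attention_distributions(sq, sk))
    l_vr = kl(value_relation(tv), value_relation(sv))
    return l_at + l_vr
```

In practice the Q/K/V tensors would come from a framework forward pass (with gradients flowing only into the student), but the loss itself reduces to these two KL terms; the loss is zero when the student's attention and value relations match the teacher's exactly.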
Best,
Lei
@WenhuiWang0824
Hi, I was just wondering if there is any update regarding this? Having these scripts would be amazingly helpful! 🙂
Thank you so much!
Do you have any update on this, please?
@WenhuiWang0824
Bump on this thread. Any update would be appreciated. Many researchers, and the community at large, would benefit from being able to distill non-BERT LMs.
@donglixp Any update on this?
Yup, I am still interested in these scripts as well, if at all possible to release 🙂
+1
+1
+1
+1
+1
+1
It looks like the authors have not yet released the distillation scripts, right?
+1