
Any plans to release sample code for MiniLM distillation?

leimao opened this issue 5 years ago • 14 comments

Hi,

Thank you for releasing the distilled MiniLM models obtained from pre-trained Transformer models. I wonder if you have any plans to release sample code for the MiniLM distillation implementation, in either TensorFlow or PyTorch? People who have different pre-trained Transformer models and want to distill their knowledge would benefit from such scripts. Thank you very much.

Best,

Lei

leimao avatar Apr 06 '20 22:04 leimao
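
> Editor's note: no official script has been released in this thread. For readers who want a starting point, below is a minimal, unofficial PyTorch sketch of the deep self-attention distillation objective described in the MiniLM paper (attention-distribution transfer plus value-relation transfer from the last Transformer layer). The function name, tensor shapes, and the omission of padding masks are assumptions made here for illustration; this is not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def minilm_distillation_loss(teacher_q, teacher_k, teacher_v,
                             student_q, student_k, student_v):
    """Sketch of MiniLM deep self-attention distillation.

    All tensors are assumed to have shape [batch, heads, seq_len, head_dim]
    and come from the last Transformer layer of the teacher and student.
    Padding masks are omitted for brevity; a real script would mask
    padded positions before the softmax.
    """
    def attention_log_dist(q, k):
        # Scaled dot-product attention distribution (log-probabilities).
        scores = torch.matmul(q, k.transpose(-1, -2)) / q.size(-1) ** 0.5
        return F.log_softmax(scores, dim=-1)

    def value_relation_log_dist(v):
        # Value-relation: scaled dot-product among values (log-probabilities).
        scores = torch.matmul(v, v.transpose(-1, -2)) / v.size(-1) ** 0.5
        return F.log_softmax(scores, dim=-1)

    # KL divergence between teacher and student attention distributions.
    at_loss = F.kl_div(attention_log_dist(student_q, student_k),
                       attention_log_dist(teacher_q, teacher_k).exp(),
                       reduction="batchmean")

    # KL divergence between teacher and student value relations.
    vr_loss = F.kl_div(value_relation_log_dist(student_v),
                       value_relation_log_dist(teacher_v).exp(),
                       reduction="batchmean")

    return at_loss + vr_loss
```

> The teacher and student must expose the per-head queries, keys, and values of their last layers (and, as in the paper, use the same number of attention heads there); hooking those out of a Hugging Face model is left to the reader.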

@WenhuiWang0824

donglixp avatar Apr 07 '20 15:04 donglixp

Hi, I was just wondering if there is any update regarding this? Having these scripts would be amazingly helpful! 🙂

Thank you so much!

VenkatKS avatar Aug 04 '20 22:08 VenkatKS

Do you have any update on this, please?

padipadou avatar Mar 05 '21 10:03 padipadou

@WenhuiWang0824

Bump on this thread. Any update would be appreciated. I think many researchers, and the community at large, would benefit from being able to distill non-BERT LMs.

joshdevins avatar Jun 10 '21 13:06 joshdevins

@donglixp Any update on this?

joshdevins avatar Jun 28 '21 13:06 joshdevins

Yup, I am still interested in these scripts as well, if at all possible to release 🙂

VenkatKS avatar Jun 29 '21 01:06 VenkatKS

+1

luoqishuai avatar Jul 30 '21 09:07 luoqishuai

+1

wasiahmad avatar Aug 18 '21 21:08 wasiahmad

+1

sawwhite avatar Jan 20 '22 06:01 sawwhite

+1

mrpeerat avatar Jan 30 '22 03:01 mrpeerat

+1

dreamgonfly avatar Feb 04 '22 11:02 dreamgonfly

+1

joczu avatar Feb 07 '22 06:02 joczu

It looks like the authors have not yet released the distillation scripts, right?

xwuShirley avatar Mar 24 '22 21:03 xwuShirley

+1

xs1997zju avatar Nov 24 '22 08:11 xs1997zju