Does this library support NLP models such as the Transformer?

Open · ghost opened this issue 5 years ago · 2 comments

Hi, I am interested in this work. I want to try this algorithm to accelerate the training of NLP models. Can I use this library directly on NLP models? Thanks!

ghost avatar Mar 11 '20 00:03 ghost

Yes, it should work without any problem. Just follow the steps for wrapping the transformer in the Masking class. What happens in the background is that all weights in the module (and all its sub-modules) are multiplied by a binary mask before each forward pass.
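For intuition, here is a minimal sketch of that mechanism on a single linear layer. This is an illustration only, not the library's internal code; the 512-unit dimensions and the 10% density are arbitrary choices.

```python
import torch
import torch.nn as nn

layer = nn.Linear(512, 512)

# Random binary mask keeping ~10% of the weights (arbitrary density for illustration).
mask = (torch.rand_like(layer.weight) < 0.1).float()

# Zero out the masked weights before the forward pass, which is the effect
# the Masking class produces for every weight tensor in the wrapped module.
with torch.no_grad():
    layer.weight.mul_(mask)

out = layer(torch.randn(32, 512))  # forward pass now runs on the sparsified weights
```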

If you apply this to transformers, though, you should make sure to keep the layer norm parameters dense. You can achieve this with the remove_type(torch.nn.LayerNorm) method of the Masking class.
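To make the wiring concrete, here is a rough sketch of what that setup could look like for a small Transformer encoder, assuming the keyword names (prune_rate, prune_rate_decay, density) from the repo's example scripts around the time of this issue; later versions renamed some of these (e.g. to death_rate), so check them against the version you install. The model, loss, and step count below are placeholders.

```python
import torch
import torch.nn as nn
from sparselearning.core import Masking, CosineDecay

# Placeholder model: a small Transformer encoder.
model = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=128, nhead=4), num_layers=2
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

total_steps = 1000                     # placeholder training length
decay = CosineDecay(0.5, total_steps)  # anneals the pruning rate over training

# Keyword names are assumptions based on the repo's examples; verify per version.
mask = Masking(optimizer, prune_rate=0.5, prune_rate_decay=decay)
mask.add_module(model, density=0.1)    # keep ~10% of the weights
mask.remove_type(nn.LayerNorm)         # keep layer norm parameters dense

for step in range(total_steps):
    src = torch.randn(10, 32, 128)     # (seq, batch, d_model) dummy batch
    optimizer.zero_grad()
    loss = model(src).pow(2).mean()    # placeholder loss
    loss.backward()
    mask.step()                        # steps the optimizer, then re-applies the masks
```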

Let me know if you run into any problems.

TimDettmers avatar Mar 11 '20 03:03 TimDettmers

It's 2022 now. Did you get any positive results?

nickyi1990 avatar Oct 10 '22 14:10 nickyi1990