
Prune a model while finetuning or training.

20 nn_pruning issues

Hi @madlag @julien-c @co42 @srush @Narsil, I am trying to use `nn_pruning` to prune different transformer models.

Code:
```
model_checkpoint = "t5-small"
t5small_model = AutoModelForSeq2SeqLM.from_pretrained(model_checkpoint).to(device)
mpc.patch_model(t5small_model)
t5small_model.save_pretrained("models/patched")
```
Error:
```
...
```

Hi, I am working to prune a BART model for seq2seq purposes. Currently, I have replaced this [code](https://github.com/huggingface/nn_pruning/blob/main/notebooks/01-sparse-trainer.ipynb) with BART-based functionality. After executing, I am seeing a drop in the number of...

Hello, thanks for the amazing repo! I'm wondering what the difference is between "finetune" and "final-finetune" in `/example`. Do we train the model and the mask score in the finetune...

The tests in nn_pruning/tests/test_quantization.py fail. A fix for this was proposed in PR #38.

This PR fixes the nn_pruning/tests/test_quantization.py tests. How:
- Fixing the `symbolic_trace` import in nn_pruning/modules/quantization.py
- Finding a working combination of transformers and torch versions and fixing those in setup.py.

Note...
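Pinning a known-good dependency combination in setup.py is usually done through `install_requires`. A minimal sketch, assuming illustrative version numbers (the actual versions chosen in the PR are not shown here):

```python
# Hedged sketch of pinning dependency versions for setup.py.
# The version numbers below are illustrative assumptions, not the ones from this PR.
PINNED_REQUIREMENTS = [
    "transformers==4.10.0",  # assumed known-good version
    "torch==1.9.0",          # assumed known-good version
]

# In setup.py these pins would be passed as:
#   setup(name="nn_pruning", install_requires=PINNED_REQUIREMENTS)
# so that pip resolves exactly these versions instead of the latest releases.
```

Exact `==` pins trade flexibility for reproducibility, which is the point when a test suite only passes with one specific combination.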

Since the documentation tested BERT models with good results, one question is whether the nn_pruning methods can be applied to other Transformer models, such as Google ViT, Swin Transformer, and...

When testing quantization, it raises errors. May I ask if anyone has encountered this problem? pytorch==3.8.1 transformers==4.7.0
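When chasing a version mismatch like this, a first step is to report the exact versions actually installed. A generic sketch using only the standard library (the package names are assumed to be the usual PyPI ones):

```python
# Report installed versions of the packages nn_pruning depends on,
# without importing them (so this works even when an import itself fails).
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg: str) -> str:
    """Return the installed version of pkg, or 'not installed'."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return "not installed"

for pkg in ("torch", "transformers"):
    print(pkg, installed_version(pkg))
```

Including this output in a bug report lets maintainers reproduce the environment instead of guessing at it.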

Can you please provide a Dockerfile or requirements.txt to reproduce the results? I installed this library but had to make a few changes to run the example notebooks. On running...
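Until an official requirements.txt lands, one common workaround is to freeze the environment in which the notebooks do run. A minimal sketch, assuming pip is available as a module of the active interpreter:

```shell
# Capture the exact package versions of the current (working) environment.
python3 -m pip freeze > requirements.txt
```

On another machine, `python3 -m pip install -r requirements.txt` recreates the same versions; a Dockerfile can do the equivalent by copying requirements.txt into the image and running that install step.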

Hi there @madlag, thanks for your great work! It seems there is a problem for MNLI if we update text_classification/parameters.json with **do_train: 0** and run the following:
```
mkdir result
export...
```