
Load error after pruning

Open Gusha-nye opened this issue 6 months ago • 0 comments

After pruning Qwen2.5-3B with the examples/llm/prune_llm.py script and saving the result, I tried to load the pruned model with AutoModelForCausalLM.from_pretrained(), but it failed with the following error: [screenshot of error]

The official usage notes state that the model can be loaded with AutoModelForCausalLM.from_pretrained() after pruning: [screenshot of documentation]

My inference code is shown below: [screenshot of code] What should I do to fix this? I'd appreciate any reply.
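For context on why this kind of load error typically happens (the exact traceback is in the screenshot above, so this is an assumption about the cause): structural pruning shrinks the weight tensors, but config.json may still record the original dimensions, and from_pretrained() rebuilds the architecture from the config before loading weights, producing a size-mismatch error. A minimal sketch of the mechanism in plain PyTorch, not using Torch-Pruning itself; the layer sizes are made up for illustration:

```python
import io
import torch
import torch.nn as nn

# Stand-in for a structurally pruned layer: output dim shrunk from 8 to 6.
pruned_layer = nn.Linear(8, 6)

# Rebuilding from the *original* config (8 -> 8) and loading the pruned
# weights fails, analogous to from_pretrained() with a stale config.json.
rebuilt = nn.Linear(8, 8)
try:
    rebuilt.load_state_dict(pruned_layer.state_dict())
except RuntimeError as e:
    print("load failed:", type(e).__name__)  # prints: load failed: RuntimeError

# One workaround often used for pruned models: serialize the whole module
# object so the (pruned) architecture travels with the weights.
buf = io.BytesIO()
torch.save(pruned_layer, buf)
buf.seek(0)
reloaded = torch.load(buf, weights_only=False)
print(reloaded.out_features)  # prints: 6
```

If the goal is to keep using from_pretrained(), the usual fix is to make the saved config match the pruned shapes (e.g. hidden_size, intermediate_size, head counts) before save_pretrained() — whether prune_llm.py already does that is worth checking for this case.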

Gusha-nye avatar Jun 17 '25 10:06 Gusha-nye