
Global unstructured pruning

Open issakh opened this issue 2 years ago • 4 comments

Hi, how can I implement global unstructured pruning using this library? It seems I can only prune individual layers, not the entire model.

Thanks

issakh avatar Jan 28 '22 15:01 issakh

@issakh Global unstructured pruning is supported by PyTorch itself. See the tutorial.

The reason to do structured pruning is that it actually reduces the model size; unstructured pruning only applies masks, so the parameter count stays the same.
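For reference, PyTorch's built-in `torch.nn.utils.prune.global_unstructured` ranks weights across all the listed layers at once. A minimal sketch (the toy model below is illustrative, not from the thread):

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical toy model just to demonstrate the API.
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

# Collect (module, parameter_name) pairs to prune jointly.
parameters_to_prune = [
    (m, "weight") for m in model.modules() if isinstance(m, nn.Linear)
]

# Mask the 30% of weights with the smallest L1 magnitude, ranked globally
# across all listed layers (per-layer sparsity is allowed to vary).
prune.global_unstructured(
    parameters_to_prune,
    pruning_method=prune.L1Unstructured,
    amount=0.3,
)

# The weights are only zeroed, not removed: overall sparsity is 30%.
total = sum(float(m.weight.nelement()) for m, _ in parameters_to_prune)
zeros = sum(float(torch.sum(m.weight == 0)) for m, _ in parameters_to_prune)
print(f"global sparsity: {zeros / total:.2%}")
```

Note that this masking does not shrink the tensors, which is exactly the limitation discussed above.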

iamanigeeit avatar Mar 24 '22 12:03 iamanigeeit

Hi, thanks for your response. I think I mixed up structured vs. unstructured in my question. Currently we can only decide to prune, say, 10% of the parameters of layer x and remove those channels. Is there an implementation that lets me remove 10% of the parameters of the entire model, where the percentage pruned per layer is not necessarily uniform, since some layers may be more prunable than others?

issakh avatar Mar 24 '22 13:03 issakh

Hi @issakh, maybe you can try the pruner from https://github.com/zju-vipa/KamalEngine/tree/master/kamal/slim/prunning. It randomly prunes parameters of the entire model according to the given pruning rate.

VainF avatar Mar 25 '22 06:03 VainF

@issakh Heh, lots of people ask this as well, see here

The question is how to adjust the number of parameters pruned per layer when the layer dimensions and importances all differ. That paper says they tried normalizing by the biggest value in the layer or by the layer dimensions, but they didn't report results.

Also, they tried different pruning criteria and concluded that the L1 norm was still the best, which makes life easier.
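Using the L1 norm as a structured criterion means ranking whole filters by the sum of absolute weights. A small sketch of that ranking (the layer and filter counts here are illustrative):

```python
import torch
import torch.nn as nn

# Hypothetical conv layer with 8 output filters.
conv = nn.Conv2d(3, 8, kernel_size=3)

# L1 norm of each output filter: sum |w| over (in_channels, kH, kW).
l1 = conv.weight.detach().abs().sum(dim=(1, 2, 3))

# Indices of the 2 least important filters, i.e. candidates for
# structured (whole-channel) removal.
n_prune = 2
prune_idx = torch.argsort(l1)[:n_prune]
```

In a structured pruner, `prune_idx` would then feed into the channel-removal step rather than a mask.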

I am working on an implementation that estimates the number of parameters to prune per layer based on global unstructured pruning. Basically, I assume the percentage of parameters to drop in each layer is the same as if I had done unstructured pruning.
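A rough sketch of that idea, assuming a toy model and a 50% global rate (this is not the author's actual implementation): run global unstructured pruning as an oracle, then read off each layer's resulting sparsity as its structured pruning ratio.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical model for illustration.
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(), nn.Conv2d(16, 32, 3))
convs = [m for m in model.modules() if isinstance(m, nn.Conv2d)]

# Step 1: global unstructured pruning as an oracle for layer sensitivity.
prune.global_unstructured(
    [(m, "weight") for m in convs],
    pruning_method=prune.L1Unstructured,
    amount=0.5,
)

# Step 2: each layer's zero fraction becomes its structured pruning ratio.
# Layers whose weights rank low globally end up with higher ratios.
ratios = [
    float(torch.sum(m.weight == 0)) / m.weight.nelement() for m in convs
]
```

The per-layer `ratios` could then be handed to a structured pruner such as Torch-Pruning, one ratio per layer.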

iamanigeeit avatar Mar 25 '22 11:03 iamanigeeit