
--load-serialized makes the model fail to prune

Open · Little0o0 opened this issue 3 years ago · 1 comment

I found that a model that is not wrapped in DataParallel fails to prune, i.e. --load-serialized effectively disables pruning.

When I run

python compress_classifier.py -a=resnet20_cifar -p=50 ../../../data/cifar10/ -j=22 --epochs=1 --lr=0.001 --masks-sparsity --compress=../agp-pruning/resnet18.schedule_agp.yaml --load-serialized

The total sparsity will always be 0.00

Total sparsity: 0.00

But if I run the same command line without --load-serialized

python compress_classifier.py -a=resnet20_cifar -p=50 ../../../data/cifar10/ -j=22 --epochs=1 --lr=0.001 --masks-sparsity --compress=../agp-pruning/resnet18.schedule_agp.yaml

The total sparsity will be 1.53 after 1 epoch

Total sparsity: 1.53
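
My guess is that --load-serialized builds the model without the DataParallel wrapper, and the wrapper changes every parameter name by adding a module. prefix. If the schedule lists weight names with that prefix (as Distiller's example AGP schedules appear to), then without the wrapper nothing in the YAML matches and no mask is ever applied. A minimal sketch in plain PyTorch (not Distiller code) showing the naming difference:

```python
import torch.nn as nn

# A toy model standing in for resnet20_cifar; the names below are illustrative.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU(), nn.Conv2d(8, 8, 3))
print(list(model.state_dict())[:2])
# ['0.weight', '0.bias']

# Wrapping in DataParallel prefixes every parameter name with "module."
wrapped = nn.DataParallel(model)
print(list(wrapped.state_dict())[:2])
# ['module.0.weight', 'module.0.bias']
```

If the schedule I am using (resnet18.schedule_agp.yaml) refers to names like module.layer1.0.conv1.weight, that would explain why the reported sparsity stays at 0.00 once the wrapper is gone.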

Little0o0 · Mar 21 '22

I found that model = torch.nn.DataParallel(model, device_ids=device_ids) is necessary for pruning to take effect, but I do not know the reason.
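
If the name-prefix theory above is right, a quick way to check is to compare the parameter names referenced by the schedule against the model's own names. A rough diagnostic sketch, assuming the schedule follows the usual pruners / weights layout of Distiller's example YAMLs (schedule_weight_names and report_mismatches are hypothetical helpers, not Distiller APIs):

```python
import yaml

def schedule_weight_names(yaml_path):
    """Collect the parameter names referenced by the pruners in a schedule YAML."""
    with open(yaml_path) as f:
        sched = yaml.safe_load(f)
    names = set()
    for pruner in sched.get('pruners', {}).values():
        names.update(pruner.get('weights', []))
    return names

def report_mismatches(model, yaml_path):
    """Print schedule entries that do not match any parameter of the model."""
    model_names = {name for name, _ in model.named_parameters()}
    for name in sorted(schedule_weight_names(yaml_path) - model_names):
        print('not in model:', name)
```

If the only difference turns out to be the module. prefix, then either wrapping the model in torch.nn.DataParallel before the scheduler is created (which is what happens by default, without --load-serialized) or stripping the prefix from the YAML should both make pruning take effect again.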

Little0o0 · Mar 21 '22