DeepCompression-caffe
Weird problem when pruning fc6 of CaffeNet
Hi, @may0324
Thanks for your cool work!
After cloning this code, we have been trying to replicate Han's work on pruning conv-nets, especially large ones like CaffeNet, a variant of AlexNet.
While pruning fc6, the first fully connected layer in the network, we observed that this layer can be pruned to a sparsity of at most 0.4444444: setting the sparse_ratio parameter to any value larger than 0.45 still yields an actual sparsity of 0.4444444. This is rather weird to us.
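For reference, the "actual sparsity" we report is measured as the fraction of exactly-zero entries in the layer's weight blob, along the lines of the sketch below (a hypothetical helper using only numpy, not code from this repo). Note that 0.4444444 happens to equal 4/9:

```python
import numpy as np

def sparsity(weights, tol=0.0):
    """Fraction of weights whose magnitude is <= tol (i.e. pruned away)."""
    w = np.asarray(weights)
    return np.count_nonzero(np.abs(w) <= tol) / w.size

# Toy example: a 3x3 weight matrix with 4 of its 9 entries pruned to zero,
# giving a sparsity of 4/9 = 0.4444444...
w = np.array([[0.0, 0.5, 0.0],
              [1.2, 0.0, -0.3],
              [0.0, 0.7, 0.9]])
print(round(sparsity(w), 7))  # 0.4444444
```

In our case we apply the same computation to the fc6 weight blob loaded from the pruned caffemodel (via pycaffe, `net.params['fc6'][0].data`).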
Have you ever encountered this problem? Or could you please share your advice?
Thanks again!
Cheers!