additional details on ResNet20 (low rank)
Would you please share the speed-up ratios and final ranks for ResNet-20 (CIFAR) obtained in the *Coordinating Filters for Faster Deep Neural Networks* paper?
| Conv layer | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 | 13 | 14 | 15 | 16 | 17 | 18 | 19 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Baseline | 8 | 13 | 13 | 14 | 14 | 14 | 13 | 23 | 26 | 27 | 28 | 28 | 28 | 47 | 54 | 56 | 55 | 56 | 31 |
| L2-norm force | 8 | 9 | 10 | 10 | 10 | 9 | 10 | 8 | 9 | 9 | 10 | 9 | 8 | 2 | 1 | 1 | 1 | 1 | 1 |
| Full rank | 16 | 16 | 16 | 16 | 16 | 16 | 16 | 32 | 32 | 32 | 32 | 32 | 32 | 64 | 64 | 64 | 64 | 64 | 64 |
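The table alone does not give the speed-up ratio, but you can estimate the theoretical one from it. Below is a minimal sketch under two assumptions of mine (the paper may count FLOPs or factorize differently): (a) the standard CIFAR ResNet-20 layout (all 3×3 convs, feature maps of 32/16/8), and (b) a rank-r layer is factorized into a C→r 3×3 conv followed by an r→N 1×1 conv.

```python
# Hypothetical sketch, not from the repo: theoretical FLOP speed-up implied
# by the rank table, assuming each k x k conv of rank r is split into a
# C->r k x k conv followed by an r->N 1x1 conv.
full_ranks  = [16]*7 + [32]*6 + [64]*6          # "Full rank" row (= output channels N)
force_ranks = [8, 9, 10, 10, 10, 9, 10, 8, 9,   # "L2-norm force" row
               9, 10, 9, 8, 2, 1, 1, 1, 1, 1]
in_ch   = [3] + [16]*7 + [32]*6 + [64]*5        # input channels per conv layer
spatial = [32]*7 + [16]*6 + [8]*6               # output feature-map side per layer
K = 3                                           # all ResNet-20 convs are 3x3

full_flops = low_flops = 0.0
for C, N, r, s in zip(in_ch, full_ranks, force_ranks, spatial):
    hw = s * s
    full_flops += N * C * K * K * hw            # original conv cost
    low_flops  += r * C * K * K * hw + N * r * hw  # factorized cost
print("theoretical speed-up: %.2fx" % (full_flops / low_flops))
```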
Thanks a lot! And what is the accuracy of this model? Is it 7.97%, as you report in the paper? Or is it something else, in which case the plot in the paper was for one particular regularization value and the accuracy for another? Could you please elaborate?
And one more question: you gave 19 values, but the network has 22 layers. Does that mean no low-rank parametrization happens for the other layers?
It's the one in Figure 5: 8.82% error for the baseline and 9.57% for ours. Force regularization is applied only to the conv layers, which are indexed in order in the table.
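On how the "final ranks" can be realized per conv layer: one common scheme, sketched here as an assumption rather than as the paper's exact procedure, is to SVD-truncate each flattened conv weight to the tabulated rank, giving a rank-r 3×3 conv followed by an N-filter 1×1 conv.

```python
import numpy as np

def lowrank_factor(W, r):
    """Split a conv weight W of shape (N, C, k, k) into two factors of rank r
    via truncated SVD: an (r, C, k, k) basis conv and an (N, r, 1, 1) projection.
    Hypothetical post-hoc decomposition; force regularization during training
    is what concentrates the filters' energy in the top-r singular values."""
    N, C, k, _ = W.shape
    M = W.reshape(N, C * k * k)                    # one row per filter
    U, S, Vt = np.linalg.svd(M, full_matrices=False)
    proj  = (U[:, :r] * S[:r]).reshape(N, r, 1, 1)  # 1x1 conv weights
    basis = Vt[:r].reshape(r, C, k, k)              # rank-r k x k conv
    return basis, proj

# e.g. conv layer 2 of the table: full rank 16, final rank 9
W = np.random.randn(16, 16, 3, 3).astype(np.float32)
B, P = lowrank_factor(W, r=9)
W_hat = (P.reshape(16, 9) @ B.reshape(9, -1)).reshape(W.shape)
print("relative error:", np.linalg.norm(W - W_hat) / np.linalg.norm(W))
```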
What is the lambda value you used to obtain that result? I don't see any particular values in any of the scripts.
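For context while this is unanswered: lambda here is the coefficient scaling the force regularizer added to the training loss of each conv layer. Below is a minimal sketch of one plausible reading of the "L2-norm force" variant, an attractive penalty between unit-normalized filters; the paper defines force regularization through its gradient, so both this form and the placeholder lambda are my assumptions, not the repo's values.

```python
import numpy as np

def l2_force_penalty(W, lam):
    """One plausible reading of the L2-norm force regularizer: an attractive
    penalty over all ordered pairs of unit-normalized filters, scaled by lam.
    Sketch only -- check the paper/code for the exact form and lambda."""
    N = W.shape[0]
    F = W.reshape(N, -1)
    F = F / np.linalg.norm(F, axis=1, keepdims=True)   # unit-norm filters
    sq = ((F[:, None, :] - F[None, :, :]) ** 2).sum()  # pairwise sq. distances
    return lam * sq

# Added to the data loss per conv layer during training; lam=1e-4 is a
# placeholder, NOT the value used in the paper:
W = np.random.randn(16, 16, 3, 3)
print(l2_force_penalty(W, lam=1e-4))
```

A larger lam coordinates the filters more strongly, which lowers the effective ranks in the table but typically costs some accuracy, hence the question about the exact value used.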