soft-filter-pruning
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
https://github.com/he-y/soft-filter-pruning/blob/master/pruning_train.py#L450
Why is layer1.0.conv2.weight not included in bottle_block_flag of get_small_model.py? Only conv1 and conv3 are present.
Theoretically speaking, when you prune the channels according to the output dimension, you shouldn't get any gradient for the corresponding weights during your backward pass. How do you solve this...
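Regarding the gradient question: in soft filter pruning the selected filters are zeroed in the weight data between updates rather than masked out of the graph, so they still receive gradients and can be updated again. A minimal sketch (not the repository's actual code; `soft_prune` is a hypothetical helper) of this behavior:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
conv = nn.Conv2d(3, 4, kernel_size=3, padding=1)

def soft_prune(layer, rate=0.5):
    # Rank filters by the L2 norm of their weights and zero the smallest ones
    # in-place. This is "soft": the filters stay in the model.
    w = layer.weight.data
    norms = w.view(w.size(0), -1).norm(p=2, dim=1)
    n_prune = int(w.size(0) * rate)
    idx = norms.argsort()[:n_prune]
    w[idx] = 0.0
    return idx

pruned = soft_prune(conv, rate=0.5)

# Forward/backward: the gradient of the loss w.r.t. a conv filter depends on
# the input activations, not on the filter's current value, so the zeroed
# filters still receive nonzero gradients and can be "revived" by SGD.
x = torch.randn(1, 3, 8, 8)
loss = conv(x).sum()
loss.backward()
print(conv.weight.grad[pruned].abs().sum() > 0)
```

The key point is that zeroing is applied to the weight *data* once per pruning step; nothing in the autograd graph blocks the backward pass through those filters.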
@he-y Thanks for sharing the code. Looking at the Mask class, I couldn't find how the skip connection in the residual structure is handled, as shown in the figure below: the pruned channels of the input feature and the pruned channels of the feature coming out of the second conv may differ, yet the two need to be added together. Could you explain how soft filter pruning handles this case? Thanks 🙏
Hello author. ResNet-18 and ResNet-34 use BasicBlock with expansion = 1, so the first layer does not use downsample to adjust the output dimension. If the residual-connection handling in your code is applied, it leads to a dimension mismatch. How did you handle ResNet-18 and ResNet-34 in the paper?
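The dimension mismatch both questions describe can be illustrated with a stripped-down BasicBlock (a hypothetical sketch, not the authors' code): when the identity shortcut has no downsample, the conv branch must produce exactly as many channels as the block's input, so hard-pruning the branch's output channels independently breaks the residual addition.

```python
import torch
import torch.nn as nn

class TinyBasicBlock(nn.Module):
    # Simplified BasicBlock (expansion = 1, no downsample): the identity
    # shortcut is added directly to the two-conv branch.
    def __init__(self, in_channels, branch_out):
        super().__init__()
        self.conv1 = nn.Conv2d(in_channels, branch_out, 3, padding=1)
        self.conv2 = nn.Conv2d(branch_out, branch_out, 3, padding=1)

    def forward(self, x):
        out = self.conv2(self.conv1(x))
        return out + x  # requires out and x to have matching channel counts

x = torch.randn(1, 64, 8, 8)
ok = TinyBasicBlock(64, 64)(x)  # works: branch output matches the 64-channel identity

try:
    # Hard-pruning the branch to 48 output channels breaks the addition,
    # because the identity path still carries 64 channels.
    TinyBasicBlock(64, 48)(x)
except RuntimeError as e:
    print("channel mismatch:", e)
```

This is why soft filter pruning only zeroes filters during training instead of removing them, and why extracting a small model afterwards has to treat shortcut-adjacent layers specially.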