pytorch-pruning

About gradient

Open · liuhengli opened this issue 8 years ago · 2 comments

Excuse me, there is something about the gradient I don't understand: why is the gradient's shape the same as the activation output's shape? Isn't the gradient the weight gradient, whose shape would be [i, o, 3, 3]?

liuhengli · Jul 13 '17 07:07

The gradient is the gradient of the loss with respect to each one of the activation outputs. Therefore the gradient's shape is the same as the activation output's shape.

jacobgil · Jul 13 '17 15:07
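A minimal sketch (not code from this repository) can illustrate the distinction: the gradient that flows back through an activation has the activation's shape, while the weight gradient has the weight's shape. The layer sizes below are arbitrary examples.

```python
# Sketch: capture dLoss/d(activation) with a tensor hook and compare
# its shape to the weight gradient's shape. Shapes here are illustrative.
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)  # weight shape: [8, 3, 3, 3]
x = torch.randn(1, 3, 16, 16)

grads = {}

def save_activation_grad(g):
    # Called during backward with dLoss/d(activation).
    grads["act"] = g

out = conv(x)                     # activation, shape [1, 8, 16, 16]
out.register_hook(save_activation_grad)

loss = out.sum()
loss.backward()

print(grads["act"].shape)      # torch.Size([1, 8, 16, 16]) -> matches the activation
print(conv.weight.grad.shape)  # torch.Size([8, 3, 3, 3])   -> matches the weight
```

The pruning criterion in this kind of approach ranks feature maps using the activation gradient (the first shape), not the weight gradient (the second shape).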

@jacobgil, this part is difficult to understand. For example, gradient(final_loss, layer_weight) means the gradient of the loss with respect to the layer weights, so the result has the same shape as the weights. According to your comment, is the final_loss the output (that is, the x = module(x) in your code)? And is layer_weight replaced by each one of the activation outputs (what is this, and can I find the corresponding variable in your code)? Thank you very much.

guoxiaolu · Sep 30 '17 08:09
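The question above can be answered with a hedged sketch (the names final_loss and layer_weight follow the comment, not the repository's code): torch.autograd.grad returns a gradient shaped like whatever tensor you differentiate with respect to, so passing the intermediate activation instead of the weight yields an activation-shaped gradient.

```python
# Sketch: differentiate the same loss with respect to the weight and to
# the intermediate activation; the two gradients have different shapes.
import torch
import torch.nn as nn

module = nn.Conv2d(3, 8, kernel_size=3, padding=1)
x = torch.randn(1, 3, 16, 16)

activation = module(x)         # the "x = module(x)" intermediate tensor
final_loss = activation.sum()  # stand-in for the real training loss

g_weight, g_act = torch.autograd.grad(final_loss, [module.weight, activation])

print(g_weight.shape)  # torch.Size([8, 3, 3, 3])   -> shape of the weight
print(g_act.shape)     # torch.Size([1, 8, 16, 16]) -> shape of the activation
```

So the "gradient" in the answer above is g_act: the loss differentiated with respect to the activation output, which is why it shares the activation's shape.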