keras-grad-cam

Regarding the gradient

junkwhinger opened this issue 7 years ago • 0 comments

Hi, thanks for such a great blog post. Love it.

I'm trying to understand the paper's approach more deeply through your code implementation. As far as I know, the paper suggests taking the gradient of the target class score before the softmax (i.e. the softmax input) with respect to the feature maps of the target conv layer.

In your code, I think the loss refers to the output of the softmax layer: loss = K.sum(model.layers[-1].output). I was wondering whether this should be corrected to loss = K.sum(model.layers[-1].output.op.inputs[0]), i.e. the pre-softmax scores, with the gradient then taken via the K.gradients function.
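For reference, a minimal sketch of the two variants, assuming a VGG16 model from keras.applications with "block5_conv3" as the target conv layer (both are my assumptions for illustration, not necessarily what the blog post uses) and the TF1-style Keras backend API:

```python
# Minimal sketch, not the repo's actual code: VGG16 and "block5_conv3"
# are assumed here purely for illustration.
from keras.applications.vgg16 import VGG16
from keras import backend as K

model = VGG16(weights="imagenet")
conv_output = model.get_layer("block5_conv3").output  # target conv feature maps

# Current approach: sum over the softmax *output* (post-softmax probabilities)
loss_post = K.sum(model.layers[-1].output)

# Suggested change: take the tensor feeding the softmax op, i.e. the
# pre-softmax scores (relies on TF1-style graph introspection via .op)
loss_pre = K.sum(model.layers[-1].output.op.inputs[0])

# Gradient of the pre-softmax loss w.r.t. the conv layer's feature maps
grads = K.gradients(loss_pre, [conv_output])[0]
grad_fn = K.function([model.input], [conv_output, grads])
# conv_vals, grad_vals = grad_fn([preprocessed_image_batch])
```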

Please correct me if I've misunderstood the concept or your approach.

Thank you!

junkwhinger · Jan 11 '18