
Zennit is a high-level framework in Python, built on PyTorch, for explaining and exploring neural networks with attribution methods such as LRP.

Results: 34 zennit issues, sorted by most recently updated

Support computing the gradient of the modified gradient via second-order gradients; for this, the second gradient pass must be done without hooks. TODO: finalize, ...

I am currently working on supporting second-order gradients, i.e. gradients of the modified gradients, which are used for example to compute adversarial explanations. The current issue preventing second-order...
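The mechanics can be sketched with plain (unmodified) gradients: the first backward pass is made differentiable with `create_graph=True`, and a second pass differentiates a loss on the resulting explanation with respect to the input. The function name and the squared loss below are hypothetical; with Zennit's hooks registered, the first pass would yield the modified gradient instead, which is exactly the pass that currently must run without hooks.

```python
import torch


def explanation_input_grad(model, data, target_mask):
    """Plain-gradient sketch of the second-order mechanics."""
    data = data.clone().requires_grad_(True)
    output = model(data)
    # first pass: gradient of the selected output score w.r.t. the input;
    # create_graph=True keeps this pass itself differentiable
    explanation, = torch.autograd.grad(
        (output * target_mask).sum(), data, create_graph=True
    )
    # second pass: gradient of a (hypothetical) scalar loss on the explanation
    loss = explanation.pow(2).sum()
    input_grad, = torch.autograd.grad(loss, data)
    return input_grad
```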

`Composite.context` can be implemented more simply using `contextlib.contextmanager`. Furthermore, instead of calling `Composite.context`, the same functionality could be implemented as `Composite.__call__`, since the context is the main functionality, and this would...

enhancement
core
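A minimal sketch of the slimmer variant, assuming `Composite.register` and `Composite.remove` keep their current behaviour (binding it to `__call__` instead would only change the method name):

```python
from contextlib import contextmanager


@contextmanager
def context(self, module):
    """Register the composite's hooks on ``module`` for the duration of the
    ``with`` block, then remove them again."""
    try:
        self.register(module)
        yield module
    finally:
        self.remove()
```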

Calculating the relevance on ResNet50 seems to be prone to numerical instability, producing heatmaps where all attribution is concentrated in a few spots because the values in those spots...

model compatibility
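When ResNet heatmaps degenerate like this, one thing worth checking is whether a canonizer is in place, since unmerged BatchNorm layers and implicit residual sums are a common source of such artifacts. A minimal sketch (the random input and class index are placeholders):

```python
import torch
from torchvision.models import resnet50

from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat
from zennit.torchvision import ResNetCanonizer

model = resnet50(weights=None).eval()  # use pretrained=False on older torchvision

# the ResNet canonizer merges BatchNorm into adjacent layers and makes the
# residual sums explicit so that rules can be attached to them
composite = EpsilonPlusFlat(canonizers=[ResNetCanonizer()])

data = torch.randn(1, 3, 224, 224, requires_grad=True)
target = torch.eye(1000)[[437]]  # one-hot output selection for an arbitrary class

with Gradient(model=model, composite=composite) as attributor:
    output, relevance = attributor(data, target)
```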

Some modules are implicitly mapped to the gradient. We can explicitly map Module types to `None` in their respective `module_map` in composites and warn the user when no rule is...

enhancement
core
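A hedged sketch of what such an explicit mapping could look like with the existing `Composite` and its `module_map` callable; the map itself and the warning text are hypothetical:

```python
import warnings

import torch.nn as nn

from zennit.core import Composite
from zennit.rules import Epsilon, Pass

# hypothetical explicit map: rules for some types, an explicit None (plain
# gradient) for others; anything not listed triggers a warning
EXPLICIT_MAP = [
    (nn.Conv2d, Epsilon(epsilon=1e-6)),
    (nn.Linear, Epsilon(epsilon=1e-6)),
    (nn.ReLU, Pass()),
    (nn.BatchNorm2d, None),  # deliberately use the unmodified gradient
]


def module_map(ctx, name, module):
    for module_type, rule in EXPLICIT_MAP:
        if isinstance(module, module_type):
            return rule
    # only warn for leaf modules; containers are expected to have no rule
    if not list(module.children()):
        warnings.warn(
            f'No rule mapped for {name!r} ({type(module).__name__}); '
            'falling back to the unmodified gradient.'
        )
    return None


composite = Composite(module_map=module_map)
```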

I am currently working on `GradOutHook`, which is different from the current `zennit.core.Hook` in that instead of overwriting the full gradient of the module, it only changes the gradient output....

enhancement
core
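The difference can be illustrated with plain PyTorch: a backward pre-hook (available since PyTorch 2.0) modifies only the gradient arriving at the module's output, rather than replacing the module's full input gradient. The clamping below is a hypothetical modification, not the planned `GradOutHook` behaviour:

```python
import torch
from torch import nn


def clamp_grad_output(module, grad_output):
    """Hypothetical grad-output modifier: make the gradient arriving at the
    module's output non-negative, leaving the module's backward untouched."""
    return tuple(grad.clamp(min=0.) for grad in grad_output)


module = nn.Linear(4, 4)
# backward *pre*-hooks see the output gradient before the module's backward runs
handle = module.register_full_backward_pre_hook(clamp_grad_output)

inputs = torch.randn(2, 4, requires_grad=True)
module(inputs).sum().backward()
handle.remove()
```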

Hi Chris, thank you for your support. I noticed in the [Hook source code](https://github.com/chr5tphr/zennit/blob/9defe12bcbd9a94919aabead9906bd531a79115a/src/zennit/core.py#L222) that you register a backward hook but do not save its handle. As far as I...
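For reference, the general PyTorch pattern the question alludes to: keeping the handle returned by `register_hook` is what allows the hook to be removed again.

```python
import torch

data = torch.randn(3, requires_grad=True)
handle = data.register_hook(lambda grad: grad * 2.)  # placeholder modification
data.sum().backward()
handle.remove()  # without the handle, the hook would stay registered
```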

I want to use LRP to explain a semantic segmentation task using a U-Net model (PyTorch). I tested LRP in Captum, but it does not support `nn.Upsample` and `nn.ConvTranspose2d`. I would like...
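Zennit maps rules by module type, so a composite can in principle be applied to a segmentation model as well; which layer types (e.g. `nn.ConvTranspose2d`) are covered depends on the installed version. A hedged sketch that attributes one class channel of a `(batch, classes, height, width)` output; the function name is hypothetical:

```python
import torch

from zennit.attribution import Gradient
from zennit.composites import EpsilonPlusFlat


def segmentation_heatmap(model, data, class_index):
    """Attribute the summed score of one class channel over all pixels."""
    with torch.no_grad():
        out_shape = model(data).shape
    grad_mask = torch.zeros(out_shape)
    grad_mask[:, class_index] = 1.

    composite = EpsilonPlusFlat()
    with Gradient(model=model, composite=composite) as attributor:
        _, relevance = attributor(data, grad_mask)
    return relevance
```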

This way it is possible to specify a fallback layer map if we want to apply more general rules to the remaining layers. The parameter is optional. If None is...
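Until such a parameter exists, a similar effect can be obtained by concatenating layer maps, since earlier entries take precedence in `LayerMapComposite` (assuming its current first-match behaviour); the split into `specific` and `fallback` below is only illustrative:

```python
from torch import nn

from zennit.composites import LayerMapComposite
from zennit.rules import Epsilon, Pass, ZPlus
from zennit.types import Activation, Convolution

# earlier entries take precedence, so the tail of the list already acts as a
# fallback map of more general rules
specific = [(Convolution, ZPlus())]
fallback = [(Activation, Pass()), (nn.Linear, Epsilon(epsilon=1e-6))]

composite = LayerMapComposite(layer_map=specific + fallback)
```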

Hi Christopher, I hope you're doing well, and I'm really glad that the zennit community is growing, congratulations! With a growing community, more nn.Modules want to be explained, and that's why I'm writing...

model compatibility