rachtibat
Hi Chris, thank you for your support. I noticed in the [Hook source code](https://github.com/chr5tphr/zennit/blob/9defe12bcbd9a94919aabead9906bd531a79115a/src/zennit/core.py#L222) that you register a backward hook but don't save its handle. As far as I...
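For context, registering a hook in PyTorch returns a RemovableHandle, and keeping that handle is the only way to detach the hook later. A generic sketch of the pattern (not zennit's actual code; the module and hook names are made up):

```
import torch

def my_hook(module, grad_input, grad_output):
    # inspect or modify gradients here
    return None

module = torch.nn.Linear(4, 2)
# the returned handle must be kept if the hook should ever be removed
handle = module.register_full_backward_hook(my_hook)
# ... use the module ...
handle.remove()  # without the handle, the hook stays attached
```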
Hi Christopher, hope you're doing well. I'm really glad that the zennit community is growing, congratulations! With a growing community, more nn.Modules need to be explained, and that's why I'm writing...
torchvision.ops.FrozenBatchNorm2d is part of many SOTA models and contains the same attributes/buffers as the torch.nn.BatchNorm2d module. Hence, it can be canonized with the existing SequentialMergeBatchNorm class.
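A minimal sketch of what this could look like, assuming MergeBatchNorm detects batch norms through its batch_norm_type class attribute (as in the zennit source at the time of writing); the subclass name is hypothetical:

```
from torchvision.ops import FrozenBatchNorm2d
from zennit.canonizers import SequentialMergeBatchNorm

class FrozenSequentialMergeBatchNorm(SequentialMergeBatchNorm):
    # also treat torchvision's frozen variant as a batch norm, so that
    # it is merged into the preceding linear/convolution layer
    batch_norm_type = SequentialMergeBatchNorm.batch_norm_type + (FrozenBatchNorm2d,)
```

Since FrozenBatchNorm2d stores weight, bias, running_mean, and running_var as buffers under the same names, the existing merge logic should apply unchanged.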
Hey Chris, hope you're well. I noticed an implementation detail and am unsure whether it was done on purpose, and if so, why. At [this line](https://github.com/chr5tphr/zennit/blob/master/src/zennit/core.py#L527) you take in the backward pass...
Hey, we'd like to add a new rule that smooths the MaxPool2d operation by replacing its backward pass with that of an AvgPool2d:

```
class SmoothMaxPool2dRule(BasicHook):
    def __init__(self, epsilon=1e-6, zero_params=None):
        stabilizer_fn =...
```
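The BasicHook-based version depends on zennit internals, so here is a minimal, self-contained sketch of the idea itself: a plain torch.autograd.Function that keeps the max pool in the forward pass but propagates the gradient of an average pool over the same window in the backward pass. The class name and the fixed kernel/stride arguments are made up for illustration, and the epsilon stabilizer from the snippet above is omitted:

```
import torch
import torch.nn.functional as F

class MaxPoolAvgGrad(torch.autograd.Function):
    # forward: regular max pooling
    # backward: the gradient an average pool over the same window would produce

    @staticmethod
    def forward(ctx, input, kernel_size, stride):
        ctx.save_for_backward(input)
        ctx.kernel_size = kernel_size
        ctx.stride = stride
        return F.max_pool2d(input, kernel_size, stride)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # re-run an average pool under grad to obtain its input gradient
        with torch.enable_grad():
            smooth_input = input.detach().requires_grad_(True)
            output = F.avg_pool2d(smooth_input, ctx.kernel_size, ctx.stride)
            grad_input, = torch.autograd.grad(output, smooth_input, grad_output)
        return grad_input, None, None

# usage: max-pool forward, smooth (average-pool) backward
# pooled = MaxPoolAvgGrad.apply(x, 2, 2)
```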
Hey, I think it would be really nice if we could check whether the composite actually attached to all the modules we named, and to see if there are any...
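As a rough illustration of the kind of check meant here (the helper name is hypothetical, and using torch's internal _forward_pre_hooks/_forward_hooks dicts as a proxy for an attached rule is an assumption about how zennit registers its hooks):

```
def report_unattached(model, composite, expected_names):
    # while the composite is registered, list the expected modules
    # that did not receive any forward (pre-)hooks
    with composite.context(model) as modified_model:
        return [
            name
            for name, module in modified_model.named_modules()
            if name in expected_names
            and not (module._forward_pre_hooks or module._forward_hooks)
        ]
```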
The new zennit version is incompatible with zennit-crp.
We know that activation != relevance. To localize concepts in input space, we initialize CRP with channel activations using the start_layer argument of the CondAttribution class. This works well for...
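For reference, a hedged usage sketch; the layer name 'features.40', the channel index, and the exact call signature are assumptions based on the zennit-crp examples rather than a definitive API reference:

```
from crp.attribution import CondAttribution
from zennit.composites import EpsilonPlusFlat

attribution = CondAttribution(model)
composite = EpsilonPlusFlat()

# condition on a single channel of an intermediate layer and start the
# backward pass there, initialized with that layer's activations
conditions = [{'features.40': [42]}]
attr = attribution(data.requires_grad_(True), conditions, composite,
                   start_layer='features.40')
heatmap = attr.heatmap  # localization of the concept in input space
```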
Hi, thank you for this amazing library. Do you know a simple way to change the GPU device of `param_f = lambda: param.image(self.image_shape[0], self.image_shape[1], batch=1, channels=channels)`? In the file...
Hi, you could use

```
from webdriver_manager.chrome import ChromeDriverManager

chromedriver = ChromeDriverManager().install()
```

instead of specifying the path to the chrome driver. Makes it easier (;
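For completeness, the returned path can then be handed to Selenium, sketched here assuming Selenium 4's Service API:

```
from selenium import webdriver
from selenium.webdriver.chrome.service import Service
from webdriver_manager.chrome import ChromeDriverManager

# webdriver_manager downloads a matching driver; its path goes to Selenium
driver = webdriver.Chrome(service=Service(ChromeDriverManager().install()))
```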