
LRP fails with custom rule and repeated call to `attribute`

Open jonasricker opened this issue 3 years ago • 2 comments

🐛 Bug

When LRP is used with a custom propagation rule, calling `attribute` more than once fails because the custom rule is removed in https://github.com/pytorch/captum/blob/5543b4a8d256f2fa6d43eb3ac8675235f996c9fe/captum/attr/_core/lrp.py#L382

To Reproduce

Steps to reproduce the behavior:

  1. Define a model for which no default rule exists and add a custom rule.
  2. Set up LRP with the model.
  3. Call `attribute` twice; the second call fails (see the sketch below).
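
A minimal sketch of these steps (the `Scale` module is only a stand-in for a leaf layer without a default rule, and `EpsilonRule` stands in for an arbitrary custom propagation rule):

```python
import torch
import torch.nn as nn
from captum.attr import LRP
from captum.attr._utils.lrp_rules import EpsilonRule


# Stand-in for a leaf module for which captum defines no default LRP rule.
class Scale(nn.Module):
    def forward(self, x):
        return 2.0 * x


model = nn.Sequential(nn.Linear(4, 4), Scale(), nn.Linear(4, 2))
model[1].rule = EpsilonRule()  # attach a custom rule to the unsupported layer

lrp = LRP(model)
inputs = torch.randn(1, 4)

first = lrp.attribute(inputs, target=0)   # succeeds
second = lrp.attribute(inputs, target=0)  # fails: the custom rule was removed after the first call
```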

Expected behavior

I expect that only the rules added by Captum are removed, while custom rules remain in place.

Environment

Describe the environment used for Captum

  • Captum / PyTorch Version (e.g., 1.0 / 0.4.0): 0.5.0
  • OS (e.g., Linux): Linux
  • How you installed Captum / PyTorch (conda, pip, source): pip
  • Build command you used (if compiling from source):
  • Python version: 3.9.5
  • CUDA/cuDNN version: 11.4
  • GPU models and configuration: 4 RTX 3090
  • Any other relevant information:

Additional context

jonasricker · Nov 04 '22 09:11

@jonasricker, yes, we do indeed remove it because we don't distinguish between the rules that we set and the ones that the user sets. We can definitely add a quick fix for that so that only the rules we set internally are removed.

The only problem is that if the custom rule updates any state variables and it gets reused, the user needs to be responsible for cleaning those state variables before calling `attribute` again.
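
For illustration, a hypothetical stateful rule could look like the sketch below (the `num_forward_calls` counter is made up; the point is just that any such state would persist across `attribute` calls once user-set rules are no longer removed):

```python
from captum.attr._utils.lrp_rules import EpsilonRule


# Hypothetical custom rule that accumulates state across forward passes.
class CountingEpsilonRule(EpsilonRule):
    def __init__(self, epsilon=1e-9):
        super().__init__(epsilon)
        self.num_forward_calls = 0  # state that survives between attribute() calls

    def forward_hook(self, module, inputs, outputs):
        self.num_forward_calls += 1
        return super().forward_hook(module, inputs, outputs)


# If this rule stays attached to a module, the user would reset its state
# manually before calling attribute() a second time, e.g.:
# model.some_layer.rule.num_forward_calls = 0
```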

https://github.com/pytorch/captum/blob/master/captum/attr/_core/lrp.py#L395

NarineK · Nov 10 '22 23:11