LRP fails with custom rule and repeated call to `attribute`
🐛 Bug
When LRP is used with a custom propagation rule, calling `attribute` more than once fails because the custom rule is removed in https://github.com/pytorch/captum/blob/5543b4a8d256f2fa6d43eb3ac8675235f996c9fe/captum/attr/_core/lrp.py#L382
To Reproduce
Steps to reproduce the behavior:
- Define a model for which no default rule exists and add a custom rule.
- Set up LRP with the model.
- Call `attribute` twice (a minimal sketch follows below).
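A minimal sketch of these steps; the `Square` layer and the use of `EpsilonRule` as the custom rule are illustrative choices, any layer type without a default rule should behave the same:

```python
import torch
import torch.nn as nn
from captum.attr import LRP
from captum.attr._utils.lrp_rules import EpsilonRule


class Square(nn.Module):
    # a layer type for which captum defines no default propagation rule
    def forward(self, x):
        return x ** 2


model = nn.Sequential(nn.Linear(4, 4), Square(), nn.Linear(4, 2))
model[1].rule = EpsilonRule()  # custom rule attached as `module.rule`

lrp = LRP(model)
inputs = torch.randn(1, 4)

first = lrp.attribute(inputs, target=0)   # works
second = lrp.attribute(inputs, target=0)  # fails: the custom rule was deleted after the first call
```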
Expected behavior
I expect that only the rules which were added internally by captum are removed, while custom rules remain attached.
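To illustrate, one hypothetical way to implement this (sketch only; the `_captum_default_rule` marker attribute is my own invention and not part of captum) would be to tag rules that LRP attaches itself and only delete those during cleanup:

```python
# hypothetical sketch, not captum's actual code
def attach_default_rule(layer, rule_cls):
    if not hasattr(layer, "rule"):          # user did not provide a rule
        layer.rule = rule_cls()
        layer._captum_default_rule = True   # mark as set internally


def remove_default_rules(layers):
    for layer in layers:
        if getattr(layer, "_captum_default_rule", False):
            del layer.rule                  # only remove captum-attached rules
            del layer._captum_default_rule
```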
Environment
Describe the environment used for Captum
- Captum / PyTorch Version (e.g., 1.0 / 0.4.0): 0.5.0
- OS (e.g., Linux): Linux
- How you installed Captum / PyTorch (`conda`, `pip`, source): pip
- Build command you used (if compiling from source):
- Python version: 3.9.5
- CUDA/cuDNN version: 11.4
- GPU models and configuration: 4 RTX 3090
- Any other relevant information:
Additional context
@jonasricker, yes, we do indeed remove it because we don't distinguish between the rules that we set internally and the ones that the user sets. We can definitely add a quick fix for that: we will only remove the rules that we set internally.
The only problem is that if the custom rule updates any state variables and is reused, the user is responsible for cleaning up those state variables before calling `attribute` again.
https://github.com/pytorch/captum/blob/master/captum/attr/_core/lrp.py#L395
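Until such a fix lands, one possible user-side pattern (just a sketch; the helper name is not part of captum) is to attach a fresh rule instance before every call, which both restores the deleted rule and avoids stale state inside a reused rule:

```python
def attribute_with_fresh_rule(lrp, layer, rule_factory, inputs, **kwargs):
    # re-attach a brand-new rule so the deleted rule is restored and no state
    # from a previous call leaks into this one
    layer.rule = rule_factory()
    return lrp.attribute(inputs, **kwargs)

# usage with the reproduction example above:
# attr = attribute_with_fresh_rule(lrp, model[1], EpsilonRule, inputs, target=0)
```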