Setting a rule for LRP on a different activation function
❓ Questions and Help
We have a set of listed resources available on the website and FAQ: https://captum.ai/ and https://captum.ai/docs/faq. Feel free to open an issue here on GitHub or in our discussion forums.
I am trying to apply LRP, but I see that rules need to be set for some modules. My model uses the ELU activation function, and LRP fails on it.
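A minimal sketch of what I am doing (the two-layer toy model here is just a placeholder for my real network):

```python
import torch
import torch.nn as nn
from captum.attr import LRP

# Toy model standing in for my real network; the point is the ELU activation.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ELU(),  # ELU instead of ReLU
    nn.Linear(8, 2),
)
inputs = torch.randn(1, 4)

lrp = LRP(model)
attributions = lrp.attribute(inputs, target=0)
```

Running this raises: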
```
TypeError: Module of type <class 'torch.nn.modules.activation.ELU'> has no rule defined and no default rule exists for this module type. Please, set a rule explicitly for this module and assure that it is appropriate for this type of layer.
```
How do I set this rule?
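From the wording of the error, my guess is that a rule has to be attached to each ELU module as a `rule` attribute before attributing, roughly like the sketch below (continuing from the snippet above), but I am not sure whether this is the intended API or whether `IdentityRule` is an appropriate choice for ELU:

```python
# Guess based on the error message: attach a rule attribute to each ELU
# module before running LRP. IdentityRule is an assumption on my part;
# EpsilonRule or another rule may be more appropriate for ELU.
from captum.attr._utils.lrp_rules import IdentityRule

for module in model.modules():
    if isinstance(module, nn.ELU):
        module.rule = IdentityRule()

lrp = LRP(model)
attributions = lrp.attribute(inputs, target=0)
```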
Thanks