I discovered what appears to be a forgotten class, called `captum.attr.InputBaselineXGradient`
🐛 Bug
I found a class that is present in the code and listed on the API docs site, but there is no public path to use it, and no tests were ever written for it:
captum.attr.InputBaselineXGradient # Missing tests
Defined at https://github.com/pytorch/captum/blob/master/captum/attr/_core/gradient_shap.py#L292 and listed as part of the website API here: https://github.com/pytorch/captum/blob/master/sphinx/source/gradient_shap.rst
I discovered this class's existence from the Sphinx build logs:
WARNING: autodoc: failed to import class 'InputBaselineXGradient' from module 'captum.attr'; the following exception was raised:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 448, in safe_getattr
return getattr(obj, name, *defargs)
AttributeError: module 'captum.attr' has no attribute 'InputBaselineXGradient'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/importer.py", line 110, in import_object
obj = attrgetter(obj, mangled_name)
File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 332, in get_attr
return autodoc_attrgetter(self.env.app, obj, name, *defargs)
File "/usr/local/lib/python3.7/dist-packages/sphinx/ext/autodoc/__init__.py", line 2780, in autodoc_attrgetter
return safe_getattr(obj, name, *defargs)
File "/usr/local/lib/python3.7/dist-packages/sphinx/util/inspect.py", line 464, in safe_getattr
raise AttributeError(name) from exc
AttributeError: InputBaselineXGradient
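The failure mode is easy to reproduce outside of Sphinx: autodoc ultimately resolves `captum.attr.InputBaselineXGradient` with a plain `getattr` on the module object, so a class that lives in a private submodule but is never re-exported from the package `__init__` surfaces as an `AttributeError`. A minimal sketch using a stand-in module (not captum itself):

```python
import types

# Stand-in for a package __init__ that defines a class in a private
# submodule but never re-exports it at the top level.
pkg = types.ModuleType("pkg")

# sphinx.util.inspect.safe_getattr boils down to getattr(obj, name),
# so the missing re-export surfaces as AttributeError, which autodoc
# then reports as "failed to import class ... from module ...".
try:
    getattr(pkg, "InputBaselineXGradient")
except AttributeError as exc:
    print("autodoc would fail with: AttributeError:", exc)
```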
Edit:
It appears that a decision was made not to expose InputBaselineXGradient publicly in https://github.com/pytorch/captum/pull/175 back in 2019, but it's not referenced anywhere else in the Captum code at the moment: https://github.com/pytorch/captum/search?q=InputBaselineXGradient
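Until the class is exposed publicly, it can still be reached through its private module path. A hedged workaround sketch (this assumes captum is installed and that the internal `_core` layout is unchanged; private paths can break without notice between releases):

```python
# Hedged workaround: import directly from the private module where the
# class is defined, since it is not re-exported from captum.attr.
# This relies on captum's internal layout and may break in the future.
try:
    from captum.attr._core.gradient_shap import InputBaselineXGradient
    print("available:", InputBaselineXGradient.__name__)
except ImportError:
    # captum is not installed, or the private path has moved.
    print("InputBaselineXGradient not importable from the private path")
```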
@ProGamerGov GradientShap uses InputBaselineXGradient internally, and we decided not to expose it in the public API. It is indirectly tested by the GradientShap test cases, but I agree that we could add additional tests for it and expose it in the API. To fix the rst issue we could perhaps remove:
.. autoclass:: captum.attr.InputBaselineXGradient
:members:
In a separate PR we can expose InputBaselineXGradient. So far we haven't heard any requests to use InputBaselineXGradient.
@NarineK I removed InputBaselineXGradient from the rst file in: https://github.com/pytorch/captum/pull/985