AttributeError: module 'tensorflow.python.eager.backprop' has no attribute '_record_gradient'
Hi!
I have never used SHAP before. I am using the following Keras and TensorFlow versions: Keras 2.6.0, TensorFlow 2.6.0-dev20210418.
I load my model (a CNN) and my dataset, and when I run SHAP I get the following error:
AttributeError: module 'tensorflow.python.eager.backprop' has no attribute '_record_gradient'
Here is the code:
import numpy as np
import shap
import pdb; pdb.set_trace()
#tf.compat.v1.disable_eager_execution()
background = xtrain[np.random.choice(xtrain.shape[0], 100, replace=False)]
explainer = shap.DeepExplainer(model, background)
shap_values = explainer.shap_values(xtest[1:5])
shap.image_plot(shap_values, -xtest[1:5])
Here is the error:
shap_values = explainer.shap_values(xtest[1:5]) *** AttributeError: in user code:
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\shap\explainers\_deep\deep_tf.py:243 grad_graph *
out = self.model(shap_rAnD)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\engine\base_layer.py:1005 __call__ **
outputs = call_fn(inputs, *args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\engine\sequential.py:374 call
return super(Sequential, self).call(inputs, training=training, mask=mask)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\engine\functional.py:414 call
return self._run_internal_graph(
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\engine\functional.py:550 _run_internal_graph
outputs = node.layer(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\engine\base_layer.py:1005 __call__
outputs = call_fn(inputs, *args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\keras\layers\convolutional.py:245 call
outputs = self._convolution_op(inputs, self.kernel)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\dispatch.py:206 wrapper
return target(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\nn_ops.py:1131 convolution_v2
return convolution_internal(
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\nn_ops.py:1261 convolution_internal
return op(
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\dispatch.py:206 wrapper
return target(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\deprecation.py:602 new_func
return func(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\deprecation.py:602 new_func
return func(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\nn_ops.py:2000 conv1d
value = array_ops.expand_dims(value, spatial_start_dim)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\dispatch.py:206 wrapper
return target(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\deprecation.py:535 new_func
return func(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\array_ops.py:366 expand_dims
return expand_dims_v2(input, axis, name)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\util\dispatch.py:206 wrapper
return target(*args, **kwargs)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\array_ops.py:436 expand_dims_v2
return gen_array_ops.expand_dims(input, axis, name)
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\ops\gen_array_ops.py:2304 expand_dims
_execute.record_gradient(
C:\Users\zanel\AppData\Local\Programs\Python\Python38\lib\site-packages\shap\explainers\_deep\deep_tf.py:26 custom_record_gradient
out = tf_backprop._record_gradient("shap_"+op_name, inputs, attrs, results)
AttributeError: module 'tensorflow.python.eager.backprop' has no attribute '_record_gradient'
I have already tried tf.compat.v1.disable_eager_execution() but then I have another error.
Could you please help me?
Thank you!
I have the same issue with AttributeError: module 'tensorflow.python.eager.backprop' has no attribute '_record_gradient'. Does anyone have a solution?
Cheers, XH
Same here. It worked as long as I used the GradientExplainer, but after I switched it stopped working (with both the Gradient- and the DeepExplainer), and not only for the SHAP values: the whole model.fit throws this error. Switching to my laptop made the GradientExplainer work again, by the way, but as soon as I tried the DeepExplainer the issue reappeared.
So for you guys, the GradientExplainer on different hardware might work (a rough sketch is below). Besides that, I sadly didn't find anything.
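For anyone who wants to try that workaround, here is a rough sketch of swapping in the GradientExplainer. It assumes model, xtrain and xtest are defined as in the original post; it is an illustration, not a confirmed fix.
# Sketch: use GradientExplainer instead of DeepExplainer on the same model/data
import numpy as np
import shap

background = xtrain[np.random.choice(xtrain.shape[0], 100, replace=False)]
explainer = shap.GradientExplainer(model, background)  # instead of shap.DeepExplainer
shap_values = explainer.shap_values(xtest[1:5])
shap.image_plot(shap_values, -xtest[1:5])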
Any news ?
I got the same issue when using the DeepExplainer.
Same issue.
Same issue here. I'm using shap.DeepExplainer to explain a CNN model on the MNIST dataset, and it throws the same error as shown above.
Same issue for me recently; reverting TensorFlow from 2.6.0 back to 2.5.0 fixed it for me.
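If it helps, a quick, purely illustrative sanity check of the installed versions before and after reverting (both packages expose __version__):
import tensorflow as tf
import shap
print("tensorflow", tf.__version__)  # expect 2.5.0 after reverting
print("shap", shap.__version__)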
This should have been fixed for some time now. Can anyone give feedback on whether this still happens with the latest shap version? No reproducible example has been provided here.
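For reference, a minimal sketch of the kind of reproducible example that would help, based on the MNIST setup described above; the model, shapes and sample sizes are illustrative only.
# Sketch of a minimal reproducible example (illustrative, not the original poster's model)
import numpy as np
import tensorflow as tf
import shap

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., np.newaxis].astype("float32") / 255.0
x_test = x_test[..., np.newaxis].astype("float32") / 255.0

# Tiny CNN, trained very briefly just to have weights
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x_train[:1000], y_train[:1000], epochs=1, verbose=0)

background = x_train[np.random.choice(x_train.shape[0], 100, replace=False)]
explainer = shap.DeepExplainer(model, background)
shap_values = explainer.shap_values(x_test[1:5])  # the AttributeError was raised at this step
shap.image_plot(shap_values, -x_test[1:5])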
This issue has been inactive for two years, so it's been automatically marked as 'stale'.
We value your input! If this issue is still relevant, please leave a comment below. This will remove the 'stale' label and keep it open.
If there's no activity in the next 90 days the issue will be closed.