GradientExplainer only works on copies of models?
I am trying to use SHAP with an LSTM on data with 4 time periods, so my datasets typically look like `(num_rows, 4, num_features)`. Because DeepExplainer is not compatible with TensorFlow 2.x, I have been using GradientExplainer, but I ran into this bizarre problem where calling the explainer on my model gives me a warning, while calling it on a copy of the model works properly.

```python
import shap
import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dense
from tensorflow.keras.callbacks import EarlyStopping

clf = Sequential()
clf.add(Bidirectional(LSTM(layer_size, activation='tanh', return_sequences=True)))
clf.add(Bidirectional(LSTM(layer_size, activation='tanh')))
clf.add(Dense(2, activation='softmax'))
clf.compile(loss='categorical_crossentropy',
            optimizer=tfa.optimizers.AdamW(weight_decay=.01),
            metrics=['accuracy'])

y_cat = tf.keras.utils.to_categorical(y, num_classes=2)
early_stopping_monitor = EarlyStopping(min_delta=0.01, monitor='loss', patience=10)
clf.fit(X, y_cat, epochs=200, callbacks=[early_stopping_monitor])

e = shap.GradientExplainer(clf, X)
```
If I then later call:

```python
shap_values, shap_indexes = e.shap_values(rep)
```

I get this warning:

```
WARNING:tensorflow:Layers in a Sequential model should only have a single input tensor, but we receive a <class 'list'> input: [<tf.Tensor: shape=(138, 4, 187)...
```
I do not get the warning if I change the final line of the first code segment to:

```python
import copy

e = shap.GradientExplainer(copy.deepcopy(clf), X)
```

Has anyone else run into this? Is shap thread-safe, or is there something else going on? This is the only way I have gotten GradientExplainer to work without warnings that worry me.
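For what it's worth, one speculative thing to try (untested here, and only a guess at the cause): since the warning complains about a Sequential model receiving a list input, declaring the input signature explicitly with an `Input` layer before training might change the behavior. A sketch, with `num_features` and `layer_size` as assumed placeholder values:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Bidirectional, LSTM, Dense

num_features, layer_size = 187, 16  # assumed values for illustration

# Speculative sketch (not verified): build the Sequential model with an
# explicit input shape so it is fully built before SHAP wraps its inputs.
clf = Sequential()
clf.add(Input(shape=(4, num_features)))  # 4 time steps, num_features per step
clf.add(Bidirectional(LSTM(layer_size, activation='tanh', return_sequences=True)))
clf.add(Bidirectional(LSTM(layer_size, activation='tanh')))
clf.add(Dense(2, activation='softmax'))
```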
Thanks for reporting the issue. Could you please provide a minimal reproducible example, i.e. specify how X and y are generated in your code, or use a dummy dataset?
Otherwise it is hard for us to look deeper into this.
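For reference, a minimal sketch of the kind of example requested, using random dummy data in place of the original X and y. The data shapes are taken from the warning message above; the layer size, epoch count, and the plain `adam` optimizer are simplifying assumptions:

```python
import copy
import numpy as np
import shap
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Bidirectional, LSTM, Dense

# Dummy data shaped like the tensor in the warning: (138, 4, 187).
num_rows, num_steps, num_features, layer_size = 138, 4, 187, 16
X = np.random.rand(num_rows, num_steps, num_features).astype(np.float32)
y_cat = tf.keras.utils.to_categorical(
    np.random.randint(0, 2, size=num_rows), num_classes=2)

clf = Sequential()
clf.add(Bidirectional(LSTM(layer_size, activation='tanh', return_sequences=True)))
clf.add(Bidirectional(LSTM(layer_size, activation='tanh')))
clf.add(Dense(2, activation='softmax'))
clf.compile(loss='categorical_crossentropy', optimizer='adam',
            metrics=['accuracy'])
clf.fit(X, y_cat, epochs=1, verbose=0)

e = shap.GradientExplainer(clf, X)
shap_values = e.shap_values(X[:10])  # the warning reportedly appears here

# ...but reportedly not when explaining a deep copy of the model.
e2 = shap.GradientExplainer(copy.deepcopy(clf), X)
shap_values2 = e2.shap_values(X[:10])
```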
This issue has been inactive for two years, so it's been automatically marked as 'stale'.
We value your input! If this issue is still relevant, please leave a comment below. This will remove the 'stale' label and keep it open.
If there's no activity in the next 90 days, the issue will be closed.