
Integrated Gradient for Multi-Task Models

Open senjed opened this issue 4 years ago • 15 comments

How can the integrated gradients explainer be used for a model with multiple outputs (e.g. a multi-task classification model where each output can take a few possible values)? Can you please provide a working example? How can one choose to get attributions with respect to each of the outputs?

senjed avatar Feb 06 '21 23:02 senjed

At the moment there's no option to select an output within the IntegratedGradients class, but if you have a Keras or tf.keras model with multiple outputs, you can do something like this:

from tensorflow.keras.models import Model
from alibi.explainers import IntegratedGradients

explanations = []
for out in model.output:
    # wrap each output in its own single-output model and explain it separately
    model_2 = Model(inputs=model.input, outputs=out)
    ig = IntegratedGradients(model_2)
    explanation = ig.explain(X)
    explanations.append(explanation)

It's good to keep this issue open to understand whether it would be convenient to add an option that allows passing an output as a parameter.

gipster avatar Feb 08 '21 14:02 gipster

I am getting a Graph disconnected error when I try to do this.

senjed avatar Feb 08 '21 17:02 senjed

@senjed could you provide a summary view of what your model looks like (i.e. using the summary() method) for the original and the new model?

jklaise avatar Feb 18 '21 11:02 jklaise

We should have an example of this.

jklaise avatar Jul 14 '21 16:07 jklaise
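For reference, a minimal self-contained sketch of the workaround above might look like the following. The two-head classifier, layer sizes and random data are made up purely for illustration; the only point is how each output is wrapped in its own single-output model before being passed to IntegratedGradients.

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model
from alibi.explainers import IntegratedGradients

# toy multi-task model: one input, two softmax heads (illustrative only)
inputs = tf.keras.Input(shape=(16,))
hidden = tf.keras.layers.Dense(32, activation='relu')(inputs)
out_a = tf.keras.layers.Dense(3, activation='softmax', name='task_a')(hidden)
out_b = tf.keras.layers.Dense(5, activation='softmax', name='task_b')(hidden)
model = Model(inputs=inputs, outputs=[out_a, out_b])

X = np.random.rand(10, 16).astype(np.float32)

explanations = []
for out in model.outputs:
    # wrap each task head in its own single-output model
    model_single = Model(inputs=model.inputs, outputs=out)
    ig = IntegratedGradients(model_single)
    preds = model_single(X).numpy().argmax(axis=1)   # per-instance predicted class
    explanation = ig.explain(X, target=preds)
    explanations.append(explanation)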

Hello; I have the same issue: I need to apply Integrated Gradients to a multi-task model with two inputs and two outputs, but I'm getting a Graph disconnected error when I try to create a loop:

explanations = []
out = new_model.output[0]
inp = new_model.inputs[0]
model_task = Model(inp, out)
ig = IntegratedGradients(model_task)
explanation = ig.explain(Z)
explanations.append(explanation)

nouna99 avatar Sep 22 '21 08:09 nouna99

@nouna99 could you provide a short, self-contained example of the multi-task model with the error? This will help us debug the issue.

jklaise avatar Sep 22 '21 08:09 jklaise

new_model = tf.keras.models.load_model(
    my_wd,
    custom_objects={'f1_score': f1_score, 'precision': precision,
                    'recall': recall, 'AdamWeightDecay': optimizer})
new_model.summary()

[model summary screenshot]

from tensorflow.keras.models import Model

predictions = pred_class
explanations = []
for out in new_model.output:
    for inp in new_model.inputs:
        model_task = Model(inputs=inp, outputs=out)
        ig = IntegratedGradients(model_task)
        explanation = ig.explain(X_test_sample)
        explanations.append(explanation)


ValueError                                Traceback (most recent call last)
<ipython-input> in <module>()
      5 for out in new_model.output:
      6     for inp in new_model.inputs:
----> 7         model_task = Model(inputs=inp, outputs=out)
      8         ig = IntegratedGradients(model_task)
      9         explanation = ig.explain(X_test_sample)

4 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/functional.py in _map_graph_network(inputs, outputs)
    982             'The following previous layers '
    983             'were accessed without issue: ' +
--> 984             str(layers_with_complete_input))
    985     for x in tf.nest.flatten(node.outputs):
    986       computable_tensors.add(id(x))

ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 114), dtype=tf.int32, name='input_word_ids2'), name='input_word_ids2', description="created by layer 'input_word_ids2'") at layer "keras_layer". The following previous layers were accessed without issue: []

nouna99 avatar Sep 22 '21 08:09 nouna99
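A likely reading of the traceback above: the nested loop pairs each output with a single input, but both task heads of the model consume both inputs, so Keras cannot trace a complete graph from one input alone. Keeping all of the original inputs and restricting only the output avoids the disconnection. A hedged sketch with a made-up two-input / two-output model (names and shapes are illustrative only):

import numpy as np
import tensorflow as tf
from tensorflow.keras.models import Model
from alibi.explainers import IntegratedGradients

# made-up two-input model in which both heads depend on both inputs
in_a = tf.keras.Input(shape=(8,), name='in_a')
in_b = tf.keras.Input(shape=(8,), name='in_b')
merged = tf.keras.layers.Concatenate()([in_a, in_b])
out_1 = tf.keras.layers.Dense(4, activation='softmax', name='out_1')(merged)
out_2 = tf.keras.layers.Dense(6, activation='softmax', name='out_2')(merged)
multi = Model(inputs=[in_a, in_b], outputs=[out_1, out_2])

# raises "Graph disconnected": out_1 also depends on in_b, which has been dropped
# Model(inputs=in_a, outputs=multi.outputs[0])

# works: keep every input, restrict only the output to a single task head
task_1 = Model(inputs=multi.inputs, outputs=multi.outputs[0])
ig = IntegratedGradients(task_1)
Xa = np.random.rand(5, 8).astype(np.float32)
Xb = np.random.rand(5, 8).astype(np.float32)
explanation = ig.explain([Xa, Xb], target=[0] * 5)   # multi-input explain with a list of arrays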

@nouna99 is it possible to share the model architecture, how it was constructed?

jklaise avatar Sep 22 '21 09:09 jklaise

Hello; can I share it in private with you please? Regards

nouna99 avatar Sep 22 '21 12:09 nouna99

Hi @nouna99. Probably there's no need to share it in private. Would you mind sharing a minimal piece of code that reproduces the issue and can be run as-is? It would be very helpful.

gipster avatar Sep 22 '21 13:09 gipster

OK, I will share it. I have a question about the current version of IG: does it support multiple-input and multiple-output models?

nouna99 avatar Sep 22 '21 13:09 nouna99

At the moment it does support multiple inputs. The comment below suggests a workaround that might work for multiple outputs, but multiple-output models are not natively supported.

At the moment there's no option to select an output within the IntegratedGradients class, but if you have a Keras or tf.keras model with multiple outputs, you can do something like this:

from tensorflow.keras.models import Model 

explanations = []
for out in model.output:
    model_2 = Model(inputs=model.input, outputs=out)
    ig  = IntegratedGradients(model_2)
    explanation = ig.explain(X)
    explanations.append(explanation)

It's good to keep this issue open to understand whether it would be convenient to add an option that allows passing an output as a parameter.

gipster avatar Sep 22 '21 13:09 gipster

OK, thank you for your feedback. In this case, what target parameter should be passed to the explain function? In my case I have two outputs.

nouna99 avatar Sep 22 '21 14:09 nouna99

Since multiple-output models are not supported at the moment, there's no target parameter to pass for multiple outputs. If the workaround I suggested (which may or may not work) doesn't help, please provide some minimal code that reproduces the issue so we can have a look and help you.

At the moment it does support multiple inputs. The comment below suggests a workaround that might work for multiple outputs, but multiple-output models are not natively supported.
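For illustration, once each task head has been wrapped in its own single-output model as in the quoted workaround, target works exactly as for any single-output classifier, e.g. the predicted class of each instance. A hedged sketch reusing new_model from the snippets above, with X standing in for suitably prepared model inputs:

from tensorflow.keras.models import Model
from alibi.explainers import IntegratedGradients

explanations = []
for out in new_model.outputs:
    model_task = Model(inputs=new_model.inputs, outputs=out)   # one single-output model per task
    preds = model_task.predict(X).argmax(axis=1)               # predicted class for this task
    ig = IntegratedGradients(model_task)
    explanations.append(ig.explain(X, target=preds))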

gipster avatar Sep 22 '21 14:09 gipster

Hello; after using the workaround that you proposed, I get this error:

X_test = [X1_test, X2_test]
nb_samples = 10
X_test_sample = X_test[:nb_samples]
Z1 = np.asarray(X_test_sample[0]['input_word_ids'])
Z2 = np.asarray(X_test_sample[1]['input_word_ids'])
Z = [Z1, Z2]

from tensorflow.keras.models import Model

predictions = pred_class
explanations = []
for out in new_model.output:
    model_task = Model(inputs=new_model.input, outputs=out)
    ig = IntegratedGradients(model_task)
    explanation = ig.explain(Z)
    explanations.append(explanation)

The error:

WARNING:tensorflow:The dtype of the watched tensor must be floating (e.g. tf.float32), got tf.int32
WARNING:tensorflow:The dtype of the watched tensor must be floating (e.g. tf.float32), got tf.int32
WARNING:tensorflow:The dtype of the watched tensor must be floating (e.g. tf.float32), got tf.int32
WARNING:tensorflow:The dtype of the watched tensor must be floating (e.g. tf.float32), got tf.int32

AssertionError                            Traceback (most recent call last)
<ipython-input> in <module>()
      9     model_task = Model(inputs=new_model.input, outputs=out)
     10     ig = IntegratedGradients(model_task)
---> 11     explanation = ig.explain(Z)
     12     explanations.append(explanation)

6 frames
/usr/local/lib/python3.7/dist-packages/keras/engine/functional.py in _run_internal_graph(self, inputs, training, mask)
    557     for x in self.outputs:
    558       x_id = str(id(x))
--> 559       assert x_id in tensor_dict, 'Could not compute output ' + str(x)
    560       output_tensors.append(tensor_dict[x_id].pop())
    561

AssertionError: Could not compute output KerasTensor(type_spec=TensorSpec(shape=(None, 6), dtype=tf.float32, name=None), name='dense_2/Softmax:0', description="created by layer 'dense_2'")

My model is below:

[model architecture screenshot]

nouna99 avatar Sep 22 '21 14:09 nouna99
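A note on the repeated warnings above: IntegratedGradients differentiates with respect to the model inputs, and a tf.int32 tensor of token ids cannot be watched by the gradient tape. For models whose inputs are integer token ids, alibi's IntegratedGradients can instead attribute with respect to an internal layer (typically the embedding layer) via its layer argument. A hedged sketch only; the layer index below is a placeholder that would have to be adapted to the actual model, and this addresses the dtype warnings, not necessarily the AssertionError itself:

from tensorflow.keras.models import Model
from alibi.explainers import IntegratedGradients

for out in new_model.outputs:
    model_task = Model(inputs=new_model.inputs, outputs=out)
    embedding_layer = model_task.layers[2]            # placeholder: pick the real embedding layer
    ig = IntegratedGradients(model_task, layer=embedding_layer)
    explanation = ig.explain(Z, target=pred_class)    # pred_class: per-instance target classes for this task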