Handle models that are aggregations of models
As a hotfix for the issue regarding the order of the input tensor, I created a model that is an aggregation of two models: in the first one, I added a permute layer to handle the channel order.
The hotfix works well in the forward pass (tested). Unfortunately, the following error is raised by Xplique; I think aggregations of models are not handled by Xplique.
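For reference, here is roughly how such an aggregated model can be built (a minimal sketch, not my actual code: the base model, layer sizes, and number of classes are hypothetical; only the permute-then-nest pattern matters):

import tensorflow as tf

# Hypothetical channels-last base model standing in for the second model.
base_model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(512, 612, 3)),
    tf.keras.layers.Conv2D(8, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10),
])

# Wrapper model: permute channels-first inputs (None, 3, 512, 612) to
# channels-last before calling the base model as a nested layer.
inputs = tf.keras.Input(shape=(3, 512, 612))
x = tf.keras.layers.Permute((2, 3, 1))(inputs)  # (3, H, W) -> (H, W, 3)
outputs = base_model(x)
modper = tf.keras.Model(inputs, outputs)

print(modper(tf.zeros((1, 3, 512, 612))).shape)  # forward pass works: (1, 10)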
ValueError Traceback (most recent call last)
/tmp/ipykernel_8110/184323267.py in <module>
7
8 # create an explainer and generate explanations
----> 9 explainer = GradCAM(modper)
10 explanations = explainer(X_preprocessed, Y) # `explainer.explain(inputs, labels)` also works
11
~/.local/lib/python3.7/site-packages/xplique/attributions/grad_cam.py in __init__(self, model, output_layer, batch_size, conv_layer)
41 batch_size: Optional[int] = 32,
42 conv_layer: Optional[Union[str, int]] = None):
---> 43 super().__init__(model, output_layer, batch_size)
44
45 # find the layer to apply grad-cam
~/.local/lib/python3.7/site-packages/xplique/attributions/base.py in __init__(self, model, output_layer, batch_size)
123 # reconfigure the model (e.g skip softmax to target logits)
124 target_layer = find_layer(model, output_layer)
--> 125 model = tf.keras.Model(model.input, target_layer.output)
126
127 # sanity check, output layer before softmax
/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
528 self._self_setattr_tracking = False # pylint: disable=protected-access
529 try:
--> 530 result = method(self, *args, **kwargs)
531 finally:
532 self._self_setattr_tracking = previous_value # pylint: disable=protected-access
/opt/conda/lib/python3.7/site-packages/keras/engine/functional.py in __init__(self, inputs, outputs, name, trainable, **kwargs)
107 generic_utils.validate_kwargs(kwargs, {})
108 super(Functional, self).__init__(name=name, trainable=trainable)
--> 109 self._init_graph_network(inputs, outputs)
110
111 @tf.__internal__.tracking.no_automatic_dependency_tracking
/opt/conda/lib/python3.7/site-packages/tensorflow/python/training/tracking/base.py in _method_wrapper(self, *args, **kwargs)
528 self._self_setattr_tracking = False # pylint: disable=protected-access
529 try:
--> 530 result = method(self, *args, **kwargs)
531 finally:
532 self._self_setattr_tracking = previous_value # pylint: disable=protected-access
/opt/conda/lib/python3.7/site-packages/keras/engine/functional.py in _init_graph_network(self, inputs, outputs)
191 # Keep track of the network's nodes and layers.
192 nodes, nodes_by_depth, layers, _ = _map_graph_network(
--> 193 self.inputs, self.outputs)
194 self._network_nodes = nodes
195 self._nodes_by_depth = nodes_by_depth
/opt/conda/lib/python3.7/site-packages/keras/engine/functional.py in _map_graph_network(inputs, outputs)
982 'The following previous layers '
983 'were accessed without issue: ' +
--> 984 str(layers_with_complete_input))
985 for x in tf.nest.flatten(node.outputs):
986 computable_tensors.add(id(x))
ValueError: Graph disconnected: cannot obtain value for tensor KerasTensor(type_spec=TensorSpec(shape=(None, 3, 512, 612), dtype=tf.float32, name='input.1'), name='input.1', description="created by layer 'input.1'") at layer "87_pad". The following previous layers were accessed without issue: []
Sorry for the delay, and I hope it is not too late, but could you also provide a minimal code example leading to this error? That would allow us to reproduce it and try to fix it.
There are several methods that will not work for non-standard model architectures such as aggregated models: GradCAM, GradCAMPP, DeconvNet, and GuidedBackpropagation.
But you should be able to use all the other methods. For white-box methods, model(inputs) should be differentiable with respect to inputs.
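For example (a sketch, not tested on your model; modper, X_preprocessed and Y are the objects from your snippet, assumed to be a channels-first batch and one-hot labels), you can first check that gradients flow through the aggregated model and then use a purely gradient-based method such as Saliency:

import tensorflow as tf
from xplique.attributions import Saliency

# Check that model(inputs) is differentiable with respect to inputs.
x = tf.convert_to_tensor(X_preprocessed, dtype=tf.float32)
with tf.GradientTape() as tape:
    tape.watch(x)
    # scalar score built from the targeted outputs (Y assumed one-hot)
    score = tf.reduce_sum(modper(x) * tf.cast(Y, tf.float32))
grads = tape.gradient(score, x)
assert grads is not None  # gradients flow, so white-box methods are usable

# Saliency only needs gradients of model(inputs), so the aggregated model
# should be accepted directly, without rebuilding the Keras graph.
explainer = Saliency(modper)
explanations = explainer(X_preprocessed, Y)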
Since there is no more activity on this issue and no answer to @AntoninPoche's comment, I will mark this issue as resolved. If anyone runs into the same issue, feel free to re-open it.