Cannot load back model with no-op Concatenate layer
Please go to TF Forum for help and support:
https://discuss.tensorflow.org/tag/keras
If you open a GitHub issue, here is our policy:
It must be a bug, a feature request, or a significant problem with the documentation (for small docs fixes please send a PR instead). The form below must be filled out.
Here's why we have that policy:
Keras developers respond to issues. We want to focus on work that benefits the whole community, e.g., fixing bugs and adding features. Support only helps individuals. GitHub also notifies thousands of people when issues are filed. We want them to see you communicating an interesting problem, rather than being redirected to Stack Overflow.
System information.
- Have I written custom code (as opposed to using a stock example script provided in Keras): Yes
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur 11.6
- TensorFlow installed from (source or binary): binary
- TensorFlow version (use command below): v2.6.0-rc2-32-g919f693420e 2.6.0
- Python version: 3.9.7
- Bazel version (if compiling from source): N/A
- GPU model and memory: N/A
- Exact command to reproduce:
You can collect some of this information using our environment capture script:
https://github.com/tensorflow/tensorflow/tree/master/tools/tf_env_collect.sh
You can obtain the TensorFlow version with: python -c "import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)"
Describe the problem.
When I create a simple model with a dummy Concatenate layer (i.e., the concatenation receives a single input), I am able to save it successfully, but the subsequent model loading fails.
Describe the current behavior.
Loading a trained model fails.
Describe the expected behavior.
The model loading should finish without errors.
- Do you want to contribute a PR? (yes/no): No
- If yes, please read this page for instructions
- Briefly describe your candidate solution (if contributing): N/A
Standalone code to reproduce the issue.
Provide a reproducible test case that is the bare minimum necessary to generate the problem. If possible, please share a link to Colab/Jupyter/any notebook.
import tensorflow as tf

if __name__ == "__main__":
    input_layer = tf.keras.Input(shape=[100])
    dense_layer = tf.keras.layers.Dense(1)(input_layer)
    concatenate_layer = tf.keras.layers.Concatenate()([dense_layer])
    model = tf.keras.Model([input_layer], [concatenate_layer])
    model.compile(optimizer="adam", loss="mean_absolute_error")
    model.save("model.h5")
    loaded_model = tf.keras.models.load_model("model.h5")
Source code / logs.
Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached. Try to provide a reproducible test case that is the bare minimum necessary to generate the problem.
Full traceback:
Traceback (most recent call last):
File "/Users/stefan/workspace/tierra/bug.py", line 10, in <module>
loaded_model = tf.keras.models.load_model("model.h5")
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/saving/save.py", line 200, in load_model
return hdf5_format.load_model_from_hdf5(filepath, custom_objects,
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/saving/hdf5_format.py", line 180, in load_model_from_hdf5
model = model_config_lib.model_from_config(model_config,
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/saving/model_config.py", line 52, in model_from_config
return deserialize(config, custom_objects=custom_objects)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/layers/serialization.py", line 208, in deserialize
return generic_utils.deserialize_keras_object(
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/utils/generic_utils.py", line 674, in deserialize_keras_object
deserialized_obj = cls.from_config(
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/functional.py", line 662, in from_config
input_tensors, output_tensors, created_layers = reconstruct_from_config(
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/functional.py", line 1283, in reconstruct_from_config
process_node(layer, node_data)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/functional.py", line 1231, in process_node
output_tensors = layer(input_tensors, **kwargs)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/base_layer.py", line 976, in __call__
return self._functional_construction_call(inputs, args, kwargs,
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/base_layer.py", line 1114, in _functional_construction_call
outputs = self._keras_tensor_symbolic_call(
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/base_layer.py", line 848, in _keras_tensor_symbolic_call
return self._infer_output_signature(inputs, args, kwargs, input_masks)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/base_layer.py", line 886, in _infer_output_signature
self._maybe_build(inputs)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/engine/base_layer.py", line 2659, in _maybe_build
self.build(input_shapes) # pylint:disable=not-callable
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/utils/tf_utils.py", line 259, in wrapper
output_shape = fn(instance, input_shape)
File "/Users/stefan/workspace/tierra/.env/lib/python3.9/site-packages/keras/layers/merge.py", line 489, in build
raise ValueError('A `Concatenate` layer should be called '
ValueError: A `Concatenate` layer should be called on a list of at least 1 input.
@stefanistrate, In general, layers.concatenate can be used to merge all available features into a single large vector.
Why are you using the tf.keras.layers.Concatenate() layer? Can you please brief us about your use case?
In my case, I control which layers I want to concatenate via flags to the Python script. Although I generally concatenate more than one layer, the baseline uses a single layer, and it would be helpful (and expected, I would say) for Concatenate() to be a no-op when given only one input layer.
For the functional API you should use concatenate instead of Concatenate; please find the gist here for the same with multiple inputs in the concat layer. In your case there is no need to concatenate the layers; concatenation is meant for when multiple input layers are used, as in the example below.
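For instance (a minimal sketch with made-up shapes and layer sizes, not the gist itself):
import tensorflow as tf

# Two separate inputs whose features are merged into a single vector.
input_a = tf.keras.Input(shape=[100])
input_b = tf.keras.Input(shape=[50])
dense_a = tf.keras.layers.Dense(8)(input_a)
dense_b = tf.keras.layers.Dense(8)(input_b)
merged = tf.keras.layers.concatenate([dense_a, dense_b])
output = tf.keras.layers.Dense(1)(merged)
model = tf.keras.Model([input_a, input_b], [output])
model.compile(optimizer="adam", loss="mean_absolute_error")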
I know there's no need to concatenate a single layer, but, as I was saying, if I still do, Concatenate() should be a no-op. Same for concatenate(). My point is that instead of writing something like:
layers = [...]
if len(layers) == 1:
    concatenated = layers[0]
else:
    concatenated = tf.keras.layers.concatenate(layers)
I'd like to simply use:
layers = [...]
concatenated = tf.keras.layers.concatenate(layers)
And coming back to the bug I reported: I can do the above at training time, but loading back the model produces the crash.
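For now I hide the length check behind a small helper (just a sketch; the helper name is my own):
def concatenate_or_passthrough(tensors):
    # Only build a concatenation when there is actually something to merge;
    # a single tensor is returned unchanged.
    if len(tensors) == 1:
        return tensors[0]
    return tf.keras.layers.concatenate(tensors)

layers = [...]
concatenated = concatenate_or_passthrough(layers)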
This issue has been automatically marked as stale because it has no recent activity. It will be closed if no further activity occurs. Thank you.
Closing as stale. Please reopen if you'd like to work on this further.
Triage notes: we took a look and think that we should actually have concatenate throw an error with only a single input. We will try out a change and see how safe that is.
For now, the solution proposed in https://github.com/keras-team/keras/issues/15547#issuecomment-966954665 is probably best.