Save and restore transformer model
I trained a model in a Jupyter notebook following the TensorFlow transformer tutorial, and now I want to save it so I can keep working with it in my own project.
However, I ran into an issue with saving.
It fails when I save the model with transformer.save and then try to load it back with tf.keras.models.load_model, giving me this error:
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory.
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-96-b268ce7d2a7f> in <module>
7 # transformrr.summary()
8 # type(transformrr)
----> 9 new_model = tf.keras.models.load_model('transformer')
10
11 type(transformer)
1 frames
/usr/local/lib/python3.8/dist-packages/keras/saving/legacy/saved_model/load.py in _read_legacy_metadata(object_graph_def, metadata, path)
221 ):
222 if not proto.user_object.metadata:
--> 223 raise ValueError(
224 "Unable to create a Keras model from SavedModel at "
225 f"{path}. This SavedModel was exported with "
ValueError: Unable to create a Keras model from SavedModel at transformer. This SavedModel was exported with `tf.saved_model.save`, and lacks the Keras metadata file. Please save your Keras model by calling `model.save` or `tf.keras.models.save_model`. Note that you can still load this SavedModel with `tf.saved_model.load`.
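If I'm reading that error right, the directory on disk is a plain SavedModel with no keras_metadata.pb file, which tf.keras.models.load_model refuses to open as a Keras model. Here is a tiny sketch of that situation, with a toy model and a made-up path standing in for my real Transformer and folder:

```python
import tensorflow as tf

# Toy stand-in: any Keras model reproduces this; the tutorial's Transformer is
# just a much bigger subclassed tf.keras.Model.
toy = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])

# Exporting with the low-level API writes a SavedModel without keras_metadata.pb ...
tf.saved_model.save(toy, "toy_export")

# ... so reading it back as a Keras model fails with the ValueError shown above.
tf.keras.models.load_model("toy_export")
```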
OK, sure. I read that this kind of model is hard to serialize.
Then I tried tf.saved_model.save.
That saves and loads my model fine, but now the model is in SavedModel form.
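Concretely, this is the round trip that works for me now, again sketched on the toy model rather than my real code:

```python
import tensorflow as tf

toy = tf.keras.Sequential([tf.keras.layers.Dense(4, input_shape=(8,))])

tf.saved_model.save(toy, "toy_export")        # saving works ...
reloaded = tf.saved_model.load("toy_export")  # ... and so does loading,

# ... but what comes back is a generic SavedModel object, not a tf.keras.Model:
# no .fit(), .summary() or .save(), and it can only be called with inputs that
# match the function signatures traced at export time.
print(isinstance(reloaded, tf.keras.Model))   # False
```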
I tried using the loaded model with the translator again, but it gives this error:
ValueError Traceback (most recent call last)
3 frames
/usr/local/lib/python3.8/dist-packages/tensorflow/python/saved_model/function_deserialization.py in restored_function_body(*args, **kwargs)
    293             "Option {}:\n  {}\n  Keyword arguments: {}".format(
    294                 index + 1, _pretty_format_positional(positional), keyword))
--> 295     raise ValueError(
    296         "Could not find matching concrete function to call loaded from the "
    297         f"SavedModel. Got:\n  {_pretty_format_positional(args)}\n  Keyword "

ValueError: Could not find matching concrete function to call loaded from the SavedModel. Got:
  Positional arguments (1 total):
    * [<tf.Tensor 'inputs:0' shape=(1, 11) dtype=int64>, <tf.Tensor 'inputs_1:0' shape=(1, 1) dtype=int64>]
  Keyword arguments: {'training': False}

Expected these arguments to match one of the following 4 option(s):

Option 1:
  Positional arguments (1 total):
    * (TensorSpec(shape=(None, None), dtype=tf.int64, name='input_1'), TensorSpec(shape=(None, None), dtype=tf.int64, name='input_2'))
  Keyword arguments: {'training': False}

Option 2:
  Positional arguments (1 total):
    * (TensorSpec(shape=(None, None), dtype=tf.int64, name='inputs_0'), TensorSpec(shape=(None, None), dtype=tf.int64, name='inputs_1'))
  Keyword arguments: {'training': False}

Option 3:
  Positional arguments (1 total):
    * (TensorSpec(shape=(None, None), dtype=tf.int64, name='inputs_0'), TensorSpec(shape=(None, None), dtype=tf.int64, name='inputs_1'))
  Keyword arguments: {'training': True}

Option 4:
  Positional arguments (1 total):
    * (TensorSpec(shape=(None, None), dtype=tf.int64, name='input_1'), TensorSpec(shape=(None, None), dtype=tf.int64, name='input_2'))
  Keyword arguments: {'training': True}
I don't know where this error is coming from, and I don't really know how to fix it.
I would appreciate it if someone could help me figure out how to save the model properly so that it stays in Keras form, or guide me in figuring out how I can use the SavedModel form with the translator to make the translation work.
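For context, the failing call inside the translator boils down to this (the variable names are my own guesses; the shapes and dtypes are copied from the "Got:" part of the error):

```python
import tensorflow as tf

reloaded = tf.saved_model.load('transformer')       # the directory I exported to
encoder_input = tf.zeros((1, 11), dtype=tf.int64)   # tokenized source sentence
decoder_input = tf.zeros((1, 1), dtype=tf.int64)    # output generated so far

# The two tensors reach the SavedModel as a Python *list* ...
reloaded([encoder_input, decoder_input], training=False)   # -> the ValueError above

# ... while every "Option" in the error expects a *tuple* of two (None, None)
# int64 tensors, e.g.:
#   (TensorSpec(shape=(None, None), dtype=tf.int64, name='input_1'),
#    TensorSpec(shape=(None, None), dtype=tf.int64, name='input_2'))
```

The concrete shapes (1, 11) and (1, 1) look compatible with (None, None), so the only difference I can spot is the list vs. tuple nesting of the two inputs, but I don't know whether that is really what the signature matching trips over.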
Oh, I also read about save_weights(). I tried it, but it keeps on giving me errors. If that's an easier solution, please guide me on how to do it correctly :')
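In case the weights route is the easier one, the pattern I understand for a subclassed model is roughly this, sketched on a toy model because I can't paste the whole Transformer here:

```python
import tensorflow as tf

class TinyModel(tf.keras.Model):           # toy stand-in for the Transformer
    def __init__(self, units=4):
        super().__init__()
        self.dense = tf.keras.layers.Dense(units)

    def call(self, inputs, training=False):
        return self.dense(inputs)

model = TinyModel(units=4)
model(tf.zeros((1, 8)))                    # run one batch so the variables exist
model.save_weights('toy_weights')          # writes a TF checkpoint (.index + .data-*)

# In a fresh script: rebuild the model with the SAME constructor arguments,
# run one batch through it to create its variables, then restore them.
restored = TinyModel(units=4)
restored(tf.zeros((1, 8)))
restored.load_weights('toy_weights')
```

As I understand it, this only restores the variables, so the model class (and the tokenizers) would still have to be defined in the project that loads the weights.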
Could you ask this question on the TensorFlow forum? There are more people there with deeper familiarity with Colab & SavedModel. Thanks.