hub.KerasLayer call function breaks due to missing `training` argument when using a signature
When loading a SavedModel into a hub.KerasLayer with trainable=False and requesting a specific signature via the signature=... argument, the layer can't be called for inference. Instead, the following error appears:
TypeError: signature_wrapper(*, inputs) got unexpected keyword arguments: training
The expected behavior is that training should not be required in the signature call when the loaded model/signature isn't intended for fine-tuning (i.e. when passing trainable=False, which according to the docs is required when passing a signature).
A workaround is to subclass hub.KerasLayer and manually set the _has_training_argument field to False in the call function.
Reproduction in Colab: https://colab.research.google.com/drive/1m17ePeWDnQVw-oS_iC4rklhnXJnlQ9KS?usp=sharing
tensorflow 2.4.1
tensorflow_hub 0.11.0
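For reference, a minimal sketch of the subclass workaround, assuming an illustrative local handle and a serving_default signature (note that _has_training_argument is a private field and may change between tensorflow_hub versions):

import tensorflow_hub as hub

# Sketch of the workaround: subclass hub.KerasLayer and switch off the
# private _has_training_argument flag so training= is not forwarded to
# the loaded signature when the layer is used for inference only.
class InferenceOnlyKerasLayer(hub.KerasLayer):
    def call(self, inputs, training=None):
        self._has_training_argument = False  # private field; may change across versions
        return super().call(inputs)

layer = InferenceOnlyKerasLayer(
    "/path/to/saved_model",          # illustrative handle
    trainable=False,
    signature="serving_default",     # illustrative signature name
    signature_outputs_as_dict=True,
)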
Probable duplicate: link.
Currently, by design, this is part of the API contract between the logic of hub.KerasLayer and the API of the SavedModel that it loads. The API assumes the presence of the training= argument in order to encourage publishers of SavedModels to think through both the training=True and training=False cases. hub.KerasLayer was targeted at reusable models (models used as building blocks to train new models, e.g. encoders that need fine-tuning).
If the SavedModel is loaded just for inference, it might be easier to load it through a hub.load() call.
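For example, a minimal sketch of inference via hub.load, assuming a SavedModel with a serving_default signature whose input is named inputs (handle and names are illustrative):

import tensorflow as tf
import tensorflow_hub as hub

loaded = hub.load("/path/to/saved_model")               # illustrative handle
infer = loaded.signatures["serving_default"]            # pick the desired signature
outputs = infer(inputs=tf.constant([[1.0, 2.0, 3.0]]))  # no training= involved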
@edend10, please confirm the comment from @akhorlin and suggest whether this can be closed. Thanks.
Thanks for the replies @akhorlin and @arghyaganguly .
My use case is to use a pretrained SavedModel as a layer in another model, but with no fine-tuning. I believe hub.load doesn't work in that case.
In any case, I understand the reasoning for why training= makes sense for KerasLayer. The first thing I actually tried was to include training= in my exported signature. However, I couldn't find a working example anywhere of exporting a signature that also accepts training=. We can close this issue if this is the intended behavior, but I'm curious whether you have any examples of how to save a model using model.save with a signature that takes in both tensors and the training= argument.
Thanks!
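One possible approach is to expose training as a scalar boolean tensor input of the signature function. A hedged sketch under that assumption (the model, shapes, and path are illustrative, and whether hub.KerasLayer then forwards training= to this signature as intended is not verified here):

import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(3,)),
    tf.keras.layers.Dropout(0.2),   # behaves differently in training vs. inference
    tf.keras.layers.Dense(4),
])

@tf.function(input_signature=[
    tf.TensorSpec([None, 3], tf.float32, name="inputs"),
    tf.TensorSpec([], tf.bool, name="training"),
])
def serving_fn(inputs, training):
    # Keras layers accept a boolean tensor for training, so the flag can be
    # carried as a regular signature input.
    return {"outputs": model(inputs, training=training)}

model.save("/tmp/model_with_training_signature",
           signatures={"serving_default": serving_fn})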
This page might help. It covers exporting a SavedModel compatible with hub.KerasLayer. It also has links to the TF Model Garden code base, which is one of the major sources of models for tfhub.dev and has further examples of exporting SavedModels for reuse.
Thanks! Yes, I've seen that page and the Model Garden repository.
The page indicates that
Saving from a Keras Model should make all the mechanics of fine-tuning work (saving weight regularization losses, declaring trainable variables, tracing __call__ for both training=True and training=False, etc.)
However, it does not give an example of how to do this. I rummaged around the Model Garden codebase and couldn't find an example of exporting a signature with training= either. It would be nice to see a concrete example somewhere.
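For what it's worth, a minimal sketch of the export path the quoted page seems to describe: save the Keras model itself rather than a hand-written signature, so its __call__ is traced for both training=True and training=False and it can be consumed by hub.KerasLayer without the signature= argument (the model architecture and paths are illustrative):

import tensorflow as tf
import tensorflow_hub as hub

inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(16, activation="relu")(inputs)
x = tf.keras.layers.Dropout(0.2)(x)      # gets traced for both training modes on save
outputs = tf.keras.layers.Dense(4)(x)
model = tf.keras.Model(inputs, outputs)

tf.saved_model.save(model, "/tmp/reusable_model")

# Reuse as a layer; no signature= needed, and the layer handles training=.
layer = hub.KerasLayer("/tmp/reusable_model", trainable=False)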
We can manually set _has_training_argument to False with no need to subclass hub.KerasLayer, e.g.: layer = hub.KerasLayer(handle, trainable=False, signature=...); layer._has_training_argument = False.