
hub.KerasLayer call function breaks due to missing `training` argument when using a signature

Open edend10 opened this issue 4 years ago • 7 comments

When loading a SavedModel into a hub.KerasLayer with trainable=False and requesting a specific signature via the signature=... argument, the layer can't be called for inferencing. Instead, the following error appears:

TypeError: signature_wrapper(*, inputs) got unexpected keyword arguments: training

The expected behavior is not to require training to be passed in the signature call if the loaded model/signature isn't intended for fine tuning (i.e. when passing trainable=False, which according to the docs is required when passing a signature).

A workaround is to subclass hub.KerasLayer and manually set the _has_training_argument field to False in the call function.

Reproduction in Colab: https://colab.research.google.com/drive/1m17ePeWDnQVw-oS_iC4rklhnXJnlQ9KS?usp=sharing

tensorflow 2.4.1
tensorflow_hub 0.11.0

edend10 avatar Mar 13 '21 01:03 edend10

Probable duplicate: link.

arghyaganguly avatar Mar 15 '21 03:03 arghyaganguly

Currently, by design, the training= argument is part of the API contract between hub.KerasLayer and the SavedModel it loads. The API assumes the presence of the training= argument in order to encourage publishers of a SavedModel to think through both the training=True and training=False cases. hub.KerasLayer was targeted at reusable models (models used as building blocks to train new models, e.g. encoders that need fine-tuning).

If the SavedModel is used just for inference, it might be easier to load it through a hub.load() call.

akhorlin avatar Mar 15 '21 12:03 akhorlin

@edend10, please confirm the comment from @akhorlin and let us know whether this can be closed. Thanks.

arghyaganguly avatar Mar 18 '21 06:03 arghyaganguly

Thanks for the replies @akhorlin and @arghyaganguly .

My use case is to use a pretrained SavedModel as a layer in another model, but with no fine-tuning. I believe hub.load doesn't work in this case.

In any case, I understand why the training= argument makes sense for KerasLayer. The first thing I tried, actually, was to include training= in my exported signature. However, I couldn't find a working example anywhere of exporting a signature that also accepts training=. We can close this issue if this is the intended behavior, but I'm curious whether you have any examples of saving a model with model.save using a signature that takes in both tensors and the training= argument.

Thanks!

edend10 avatar Mar 19 '21 00:03 edend10

This page might help. It covers exporting a SavedModel compatible with hub.KerasLayer. It also has links to the TF Model Garden codebase, which is one of the major sources of models for tfhub.dev and has further examples of exporting SavedModels for reuse.

akhorlin avatar Mar 20 '21 08:03 akhorlin

Thanks! Yes I've seen that page and the model garden repository.

The page indicates that

Saving from a Keras Model should make all the mechanics of fine-tuning work (saving weight regularization losses, declaring trainable variables, tracing __call__ for both training=True and training=False, etc.)

However, it does not give an example of how to do this. I rummaged around the Model Garden codebase and couldn't find an example of exporting a signature with training= either. It would be nice to see a concrete example somewhere.

edend10 avatar Mar 23 '21 04:03 edend10

@edend10

Could you please refer to these examples and let us know if they help. Thanks.

UsharaniPagadala avatar Nov 02 '21 11:11 UsharaniPagadala

We can manually set _has_training_argument to False; there is no need to subclass hub.KerasLayer. Just do: layer = hub.KerasLayer(...) followed by layer._has_training_argument = False.

MichaleDong avatar Nov 24 '22 05:11 MichaleDong