keras_to_tensorflow
How can I use a .pb file for TensorFlow Serving?
Thanks for writing this helpful tool! Question, not issue...
I converted my saved Keras model (.h5 file) to a frozen TensorFlow model (.pb file) using your tool.
Now I want to use the .pb file in a Docker container running TensorFlow Serving. The tutorials I have seen say that a folder named variables is needed alongside the .pb file. However, the readme for your tool says that all variables are converted to constants. Does that mean the variables folder is not needed?
Sorry for the basic question... I am very new to this! Thanks in advance for your help.
From what I remember, there are different approaches to saving a model. One of them uses TensorFlow's SavedModel API; in that case, a variables directory is created as part of the export. You can find the details here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/saved_model/README.md
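For reference, a SavedModel export along those lines can be produced as in the sketch below. The tiny `tf.Module` is a hypothetical stand-in for a real model, used only to show the directory layout the export creates:

```python
import os
import tempfile

import tensorflow as tf


class TinyModel(tf.Module):
    """Hypothetical stand-in for a real model; holds one variable."""

    def __init__(self):
        super().__init__()
        self.w = tf.Variable(tf.ones([4, 1]))

    @tf.function(input_signature=[tf.TensorSpec([None, 4], tf.float32)])
    def __call__(self, x):
        return tf.matmul(x, self.w)


# TF Serving expects a numeric version subdirectory, e.g. .../1
export_dir = os.path.join(tempfile.mkdtemp(), "1")
tf.saved_model.save(TinyModel(), export_dir)

# The export contains saved_model.pb plus the variables/ directory
# mentioned in the linked README.
print(sorted(os.listdir(export_dir)))
```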
The other approach, which is what I am using here, is to freeze the variables and export everything into a single file. Note: you cannot load this file using the SavedModel API.
Thanks for the quick reply! What I am trying to do is set up a Docker container that serves a model with TensorFlow Serving. I have never done this before, so I am struggling to figure it out. I think I need to add a function at the front of the model to feed data to the inference model. The example I have seen in the TensorFlow documentation uses SavedModelBuilder to add that function. Is there another API you have used as an alternative? Have you used your method with a Docker container server?
Any additional hints would be welcome! Thanks!
I have the same problem as you; have you solved it?