TF Decision Forests model serving with TF Serving Docker
Hi,
I have built a TF-DF model and I am trying to serve it using Docker with the following commands:
# Saved the model using the command:
model.save(MODEL_SAVE_PATH)
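# Note: MODEL_SAVE_PATH here ends in a numeric version subdirectory
# ($PWD/models/my_classifier1/001, per the log below); TF Serving expects
# models laid out as <base_path>/<version_number>/.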
# Docker commands
docker pull tensorflow/serving
docker run -d --name serv_base_img tensorflow/serving
docker cp $PWD/models/my_classifier1 serv_base_img:/models/my_classifier1
docker commit --change "ENV MODEL_NAME my_classifier1" serv_base_img my_classifier1
docker run -p 8501:8501 --mount type=bind,source=$PWD/models/my_classifier1,target=/models/my_classifier1 -e MODEL_NAME=my_classifier1 -t tensorflow/serving &
I am getting the following error:
[1] 76832
2021-06-16 13:03:59.138269: I tensorflow_serving/model_servers/server.cc:89] Building single TensorFlow model file config: model_name: my_classifier1 model_base_path: /models/my_classifier1
2021-06-16 13:03:59.138494: I tensorflow_serving/model_servers/server_core.cc:465] Adding/updating models.
2021-06-16 13:03:59.138511: I tensorflow_serving/model_servers/server_core.cc:591] (Re-)adding model: my_classifier1
2021-06-16 13:03:59.258773: I tensorflow_serving/core/basic_manager.cc:740] Successfully reserved resources to load servable {name: my_classifier1 version: 1}
2021-06-16 13:03:59.258814: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: my_classifier1 version: 1}
2021-06-16 13:03:59.258834: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: my_classifier1 version: 1}
2021-06-16 13:03:59.259636: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:38] Reading SavedModel from: /models/my_classifier1/001
2021-06-16 13:03:59.300033: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:90] Reading meta graph with tags { serve }
2021-06-16 13:03:59.300099: I external/org_tensorflow/tensorflow/cc/saved_model/reader.cc:132] Reading SavedModel debug info (if present) from: /models/my_classifier1/001
2021-06-16 13:03:59.301471: I external/org_tensorflow/tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-06-16 13:03:59.351039: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:277] SavedModel load for tags { serve }; Status: fail: Not found: Op type not registered 'SimpleMLCreateModelResource' in binary running on de74cefbb44d. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.. Took 91403 microseconds.
2021-06-16 13:03:59.351122: E tensorflow_serving/util/retrier.cc:37] Loading servable: {name: my_classifier1 version: 1} failed: Not found: Op type not registered 'SimpleMLCreateModelResource' in binary running on de74cefbb44d. Make sure the Op and Kernel are registered in the binary running in this process. Note that if you are loading a saved graph which used ops from tf.contrib, accessing (e.g.) `tf.contrib.resampler` should be done before importing the graph, as contrib ops are lazily registered when the module is first accessed.
Is there any solution for this? Thank you!
Related: #1865
Hi,
Does the model use the op SimpleMLCreateModelResource? Could you check out https://www.tensorflow.org/tfx/serving/custom_op and link the custom op into the binary?
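For reference, a rough sketch of what that guide involves; the Bazel target label for the TF-DF inference ops below is a hypothetical example, and the exact steps are in the linked doc:
# The op has to be compiled into the model server, i.e. TF Serving rebuilt from source:
git clone https://github.com/tensorflow/serving.git
cd serving
# In tensorflow_serving/model_servers/BUILD, add the custom op target to the
# SUPPORTED_TENSORFLOW_OPS list, e.g. (hypothetical label):
#   "@org_tensorflow_decision_forests//tensorflow_decision_forests/tensorflow/ops/inference:kernel_and_op",
# Build the server binary using the helper script shipped in the serving repo:
tools/run_in_docker.sh bazel build -c opt tensorflow_serving/model_servers:tensorflow_model_server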
Hi, thank you for your response @minglotus-6. Is there a way to resolve this without having to link a custom op and build the project from source as a binary? Similar to other projects, where you just docker pull tensorflow/serving and use it directly.
https://github.com/tensorflow/serving/pull/1887
https://hub.docker.com/repository/docker/ml6team/tf-serving-tfdf
https://blog.ml6.eu/serving-decision-forests-with-tensorflow-b447ea4fc81c
You've got served. I made a Docker image, with a PR for serving underway. Enjoy.
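For example, a minimal sketch of using that image, assuming it keeps TF Serving's standard entrypoint and environment variables (see the blog post above for exact usage):
docker pull ml6team/tf-serving-tfdf
docker run -p 8501:8501 --mount type=bind,source=$PWD/models/my_classifier1,target=/models/my_classifier1 -e MODEL_NAME=my_classifier1 -t ml6team/tf-serving-tfdf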
@Vedant-R, TensorFlow Decision Forests (TF-DF) is supported natively by TF Serving >= 2.11. Please refer to tensorflow_serving.md for documentation and an example of running a TF-DF model on TF Serving.
Thank you!
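For example, with a sufficiently recent image the original commands should work unchanged; a minimal sketch, assuming the 2.11.0 tag and placeholder feature names in the request body:
docker pull tensorflow/serving:2.11.0
docker run -p 8501:8501 --mount type=bind,source=$PWD/models/my_classifier1,target=/models/my_classifier1 -e MODEL_NAME=my_classifier1 -t tensorflow/serving:2.11.0 &
# Query the standard TF Serving REST endpoint (feature names/values are placeholders):
curl -X POST http://localhost:8501/v1/models/my_classifier1:predict \
  -H "Content-Type: application/json" \
  -d '{"instances": [{"feature_1": 1.0, "feature_2": "a"}]}'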
Closing this due to inactivity. Please take a look at the answers provided above, and feel free to reopen and post your comments if you still have questions. Thank you!