Problem with TensorFlow Serving
When I create the TensorFlow Serving model from the frozen PB, I get an empty variables folder. Could you share the un-frozen graph file?
Please help me with this issue.
Which model are you referring to (v1 or v2)? In the readme I have some directions on how to use the model checkpoints (which are available for mobilenetv1) to generate your own frozen graph.
In my experience, reusing a frozen model generated with a different TensorFlow version can be challenging. It's better to export your own frozen graph from the checkpoints (see the sketch below).
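For reference, freezing from a checkpoint in TF1 looks roughly like this. It is only a minimal sketch: the checkpoint directory below is a placeholder, and it assumes the checkpoint's meta graph already contains the detection output nodes (detection_boxes, detection_scores, detection_classes, num_detections); otherwise the Object Detection API's export_inference_graph.py script is the way to go.

import tensorflow as tf

# Placeholder path; point this at your own checkpoint directory.
CHECKPOINT_DIR = 'model-checkpoint/ssdlitemobilenetv2/'
FROZEN_PB = 'frozen_inference_graph.pb'
# Output node names used by the object detection graphs.
OUTPUT_NODES = ['detection_boxes', 'detection_scores',
                'detection_classes', 'num_detections']

checkpoint = tf.train.latest_checkpoint(CHECKPOINT_DIR)

with tf.Session() as sess:
    # Rebuild the graph from the .meta file and restore the trained weights.
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    saver.restore(sess, checkpoint)
    # Convert variables to constants so the graph is self-contained.
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, OUTPUT_NODES)
    with tf.gfile.GFile(FROZEN_PB, 'wb') as f:
        f.write(frozen_graph_def.SerializeToString())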
Hi Victordibia,
I am using the ssdlitemobilenetv2 model checkpoint. Below is the code I used to convert the model checkpoint for serving, but it generates an empty variables folder.
import tensorflow as tf

SAVE_PATH = 'D:/handtracking-master/model-checkpoint/ssdlitemobilenetv2/'
MODEL_NAME = 'test'
VERSION = 5
SERVE_PATH = './serve/{}/{}'.format(MODEL_NAME, VERSION)

checkpoint = tf.train.latest_checkpoint(SAVE_PATH)
print(checkpoint)

tf.reset_default_graph()

with tf.Session() as sess:
    # Rebuild the graph from the checkpoint's .meta file.
    saver = tf.train.import_meta_graph(checkpoint + '.meta')
    graph = tf.get_default_graph()
    # Initialize the variables (note: this does not restore the checkpoint weights).
    sess.run(tf.global_variables_initializer())

    # Tensor info for the detection graph's input and output tensors.
    inputs = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('image_tensor:0'))
    detection_boxes = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_boxes:0'))
    detection_scores = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_scores:0'))
    detection_classes = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('detection_classes:0'))
    num_detections = tf.saved_model.utils.build_tensor_info(graph.get_tensor_by_name('num_detections:0'))

    export_path = './savedmodel/3'
    builder = tf.saved_model.builder.SavedModelBuilder(export_path)

    # Prediction signature mapping the serving inputs/outputs.
    prediction_signature = (
        tf.saved_model.signature_def_utils.build_signature_def(
            inputs={'inputs': inputs},
            outputs={'output1': detection_boxes,
                     'output2': detection_scores,
                     'output3': detection_classes},
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME))

    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY:
                prediction_signature
        },
    )
    builder.save()
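To check what was exported, the SavedModel can be loaded back with the TF1 loader (a minimal sketch using the export_path above):

import tensorflow as tf

export_path = './savedmodel/3'

with tf.Session(graph=tf.Graph()) as sess:
    # Load the SavedModel under the 'serve' tag and list its signature names.
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_path)
    print(list(meta_graph.signature_def.keys()))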