
Exporting darkflow-YOLO to Tensorflow Serving, but got empty variables

Open sugartom opened this issue 6 years ago • 13 comments

Hi, I was trying to export darkflow-YOLO to Tensorflow Serving, so I can use TF-Serving to run those YOLO models.

But when I used the following script to export darkflow-YOLO, the exported model was almost empty. Specifically, saved_model.pb is only 315 bytes, and the variables directory is empty (usually it should contain two files: variables.data-00000-of-00001 and variables.index):

https://gist.github.com/sugartom/70b58505bf5f28d1cf5d05904f6c0af2

I have used similar code (lines 32-58 in the above script) to export other models (Inception, caffe-tensorflow's ResNet/VGG), and everything worked fine. So I would like to ask whether you have any idea why the same code doesn't work for darkflow.

Thanks in advance! :-)

sugartom avatar Jun 28 '18 06:06 sugartom
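
One quick check, offered only as a sketch (it assumes darkflow's standard TFNet constructor on TF 1.x; the cfg/weights paths are placeholders): inspect what the session's graph actually contains before exporting. If the graph has very few operations or no variables, the exporter is effectively serializing an empty graph, which would be consistent with a 315-byte saved_model.pb and an empty variables/ directory.

# Diagnostic sketch (TF 1.x); model/weights paths are placeholders, adjust to your setup.
import tensorflow as tf
from darkflow.net.build import TFNet

options = {"model": "cfg/tiny-yolo-voc.cfg", "load": "bin/tiny-yolo-voc.weights"}
tfnet = TFNet(options)

with tfnet.sess.graph.as_default():
    # Count the operations and variables the session's graph actually holds.
    print("operations in graph:", len(tfnet.sess.graph.get_operations()))
    print("global variables:", len(tf.global_variables()))
    for v in tf.global_variables()[:10]:
        print(v.name, v.shape)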

@sugartom Any solutions? I am also getting the same issue.

Can I use the "xyz.index" and "xyz.data-00000-of-00001" files created during YOLO training for TensorFlow Serving?

gauravgola96 avatar Feb 05 '19 07:02 gauravgola96

I am also looking to export variables for serving. How can we get them, and how can we deploy the model with TF Serving?

shivaram93 avatar Jun 07 '19 07:06 shivaram93

def build_model(self):
    # Assumes TF 1.x and that this method is added to the TFNet class,
    # where self.sess already holds the loaded darkflow graph.
    with self.sess.graph.as_default():
        # Look up the graph's input and output tensors by operation name.
        x_op = self.sess.graph.get_operation_by_name("input")
        x = x_op.outputs[0]
        pred_op = self.sess.graph.get_operation_by_name("output")
        pred = pred_op.outputs[0]
    with self.sess.graph.as_default():
        # Build a prediction signature mapping the tensors to named inputs/outputs.
        prediction_signature = tf.saved_model.signature_def_utils.build_signature_def(
            inputs={
                "input": tf.saved_model.utils.build_tensor_info(x)
            },
            outputs={
                "output": tf.saved_model.utils.build_tensor_info(pred)
            },
            method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME
        )
        # Export the session's graph and variables as a SavedModel.
        # Replace the path with your own export directory; the trailing "2" is the model version.
        builder = tf.saved_model.builder.SavedModelBuilder('/home/gaurav/serving/darkflow/2')
        builder.add_meta_graph_and_variables(
            self.sess, [tf.saved_model.tag_constants.SERVING],
            signature_def_map={
                "predict": prediction_signature,
            })
        builder.save()

You can add this code to darkflow/net/build.py (inside the TFNet class) and it will create the TF Serving files directly.

gauravgola96 avatar Jun 07 '19 07:06 gauravgola96
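
If it helps, a minimal way to invoke the method above might look like the sketch below. It assumes TF 1.x, darkflow's standard TFNet constructor, and that build_model() was added to the TFNet class as shown; the cfg/weights paths are placeholders, and the output lands in the directory hardcoded inside build_model().

# Usage sketch; cfg/weights paths are placeholders.
from darkflow.net.build import TFNet

options = {"model": "cfg/tiny-yolo-voc.cfg", "load": "bin/tiny-yolo-voc.weights"}
tfnet = TFNet(options)
tfnet.build_model()  # writes saved_model.pb and variables/ under the path hardcoded in build_model()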

@gauravgola96 '/home/gaurav/serving/darkflow/2' -> what specific directory do we have to provide? Is this created after training ends?

So we have to deploy with the same TensorFlow and Python versions that we used for training and export, right? But we trained with tensorflow-gpu 1.9. How about that?

shivaram93 avatar Jun 07 '19 10:06 shivaram93

@shivaram93 '/home/gaurav/serving/darkflow/2' is the path where the TF Serving files will be saved. You have to call this method after training ends to produce the TF Serving file format. I am not sure about the versions, but the TF and Python versions should ideally be the same.

gauravgola96 avatar Jun 07 '19 11:06 gauravgola96
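
For context (not stated in the thread itself): TF Serving expects the last path component to be a numeric model version, so '/home/gaurav/serving/darkflow/2' means model "darkflow", version 2, and after a successful export the directory should look roughly like this:

/home/gaurav/serving/darkflow/
    2/
        saved_model.pb
        variables/
            variables.data-00000-of-00001
            variables.index

One common way to serve it is the stock TF Serving Docker image; the port and model name below are illustrative:

docker run -p 8501:8501 \
    --mount type=bind,source=/home/gaurav/serving/darkflow,target=/models/darkflow \
    -e MODEL_NAME=darkflow -t tensorflow/serving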

Okay, let me try and let you know. Can we train locally and deploy the models on Google Cloud for prediction? Does it support prediction and deployment alone?

shivaram93 avatar Jun 07 '19 11:06 shivaram93

@shivaram93 Yes, you can easily train tiny YOLOv2 on an 8 GB RAM i7 system. Once you have the TF Serving files from build_model(), you can deploy and host the model with TF Serving on GCloud or anywhere else you want.

gauravgola96 avatar Jun 07 '19 11:06 gauravgola96

@gauravgola96 I am receiving an Invalid GraphDef message while trying to serve the file via TF Serving. I trained a single-class tiny-YOLOv2 model and generated the TF Serving files as per your instructions above.

anisbhsl avatar Mar 30 '20 09:03 anisbhsl

@anisbhsl Check the input dimensions of the graph; that might be the reason for the invalid graph, though I'm not sure. Also follow https://github.com/thtrieu/darkflow/issues/403

gauravgola96 avatar Mar 30 '20 09:03 gauravgola96
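
One way to inspect the exported signature and its input dimensions is saved_model_cli, which ships with TensorFlow; the path below is just the example export path used earlier in the thread:

saved_model_cli show --dir /home/gaurav/serving/darkflow/2 --all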

@gauravgola96 Thanks for the reply. Here's the output from saved_model_cli:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['input'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 416, 416, 3)
        name: input:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['output'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 13, 13, 30)
        name: output:0
  Method name is: tensorflow/serving/predict

I have resized the image according to the input dimensions, but this error is generated while the file is being read.

anisbhsl avatar Mar 30 '20 09:03 anisbhsl
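
For reference, once the model does load in TF Serving, a request against the signature shown above could look like the sketch below. It assumes the REST endpoint on port 8501, a model name of darkflow, and darkflow's usual preprocessing (resize to 416x416, scale to [0, 1], BGR to RGB); double-check the preprocessing against your own pipeline.

# REST client sketch; endpoint, model name, and image path are placeholders.
import json
import cv2
import numpy as np
import requests

img = cv2.imread("test.jpg")
img = cv2.resize(img, (416, 416))
img = (img[:, :, ::-1] / 255.0).astype(np.float32)  # BGR -> RGB, scale to [0, 1]

payload = {"signature_name": "predict", "instances": [img.tolist()]}
resp = requests.post("http://localhost:8501/v1/models/darkflow:predict",
                     data=json.dumps(payload))
pred = np.array(resp.json()["predictions"])  # expected shape: (1, 13, 13, 30)
print(pred.shape)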

Hey @gauravgola96, when I run the script you gave in your post for exporting the YOLO model into TensorFlow Serving format, I get the following error:

AttributeError                            Traceback (most recent call last)
in ()
     10 export_path = "./export/1/"
     11
---> 12 tfnet.build_model(export_path = export_path)
     13
     14 with tfnet.sess.graph.as_default():

AttributeError: 'TFNet' object has no attribute 'build_model'

How can I solve this issue?

NayanDharviya avatar Dec 15 '20 12:12 NayanDharviya

@NayanDharviya: I did this with TF 1.11; can you try it out with my repo?

gauravgola96 avatar Dec 15 '20 15:12 gauravgola96