TensorFlow-2.x-YOLOv3

problems during inference when using converted tflite model

lukqw opened this issue 4 years ago · 1 comment

Hi there, I recently tried to convert your model to TF Lite and run inference on it, but I'm running into an error.

I'm using the following code to convert the model to TF Lite:

import tensorflow as tf

model = Load_Yolo_model()
model.save("./yolo_model")
model_converter = tf.lite.TFLiteConverter.from_saved_model("./yolo_model")
model_lite = model_converter.convert()
with open("./yolo.tflite", "wb") as f:
    f.write(model_lite)

This creates both the ./yolo_model SavedModel directory and the .tflite file, which I should then be able to load for inference like so:

import numpy as np
import tensorflow as tf

model = tf.lite.Interpreter(model_path="./yolo.tflite")
model.allocate_tensors()

input_details = model.get_input_details()
output_details = model.get_output_details()

input_shape = input_details[0]['shape']
input_data = np.array(np.random.random_sample(input_shape), dtype=np.float32)
model.set_tensor(input_details[0]['index'], input_data)

model.invoke()
model.get_tensor(output_details[0]['index'])

but I am getting the following error message:

external/org_tensorflow/tensorflow/lite/kernels/reshape.cc:58 stretch_dim != -1 (0 != -1)
Node number 35 (RESHAPE) failed to prepare.

Has anyone else run into this issue? I'm not entirely sure what I'm doing wrong.

lukqw avatar Oct 18 '21 21:10 lukqw

I managed to solve this problem by converting the model in the following way instead: (taken from here)

import tensorflow as tf

batch_size = 1
model = Load_Yolo_model()
model.save("./yolo_model")
input_shape = model.inputs[0].shape.as_list()
input_shape[0] = batch_size
func = tf.function(model).get_concrete_function(
    tf.TensorSpec(input_shape, model.inputs[0].dtype))
model_converter = tf.lite.TFLiteConverter.from_concrete_functions([func])
model_lite = model_converter.convert()
with open("./yolo_model.tflite", "wb") as f:
    f.write(model_lite)
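For context, my reading of the error (not verified against the converter internals): the Keras model keeps a dynamic batch dimension (None), which the converter can record as 0 in the graph, so a later RESHAPE op sees a 0 where it expects -1. Pinning the batch dimension before tracing the concrete function avoids that. A minimal sketch of just the shape fix, using a hypothetical helper pin_batch:

```python
def pin_batch(shape, batch_size=1):
    """Replace an unknown (None) leading batch dim with a concrete size.

    Mirrors the input_shape[0] = batch_size step above; the rest of the
    shape is left untouched.
    """
    shape = list(shape)
    if shape[0] is None:
        shape[0] = batch_size
    return shape

# A YOLOv3-style input signature with an unknown batch dimension:
print(pin_batch([None, 416, 416, 3]))  # [1, 416, 416, 3]
# Already-concrete shapes pass through unchanged:
print(pin_batch([4, 416, 416, 3]))    # [4, 416, 416, 3]
```

The resulting fully-concrete shape is what gets passed to tf.TensorSpec when building the concrete function.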

lukqw avatar Oct 21 '21 13:10 lukqw