TensorFlow Lite INT8 Conversion Issue
1. System information
- OS: Ubuntu 18.04
- pip3 install tensorflow-gpu==2.2.0
- pip3 install tensorflow-addons==0.10.0
- onnx==1.7.0
ONNX model link: https://github.com/onnx/models/blob/master/vision/classification/resnet/model/resnet50-v1-7.onnx
The conversion itself completes without errors, but an error occurs when running the converted TFLite model.
2. Code
onnx -> pb
```python
import onnx
from onnx_tf.backend import prepare

# Load the ONNX model and export it as a TensorFlow SavedModel.
onnx_model = onnx.load("./onnx_resnet/resnet50-v1-7.onnx")
tf_model_path = "./onnx_resnet/tf_2.2.0/tflite/saved_model"

tf_rep = prepare(onnx_model)
tf_rep.export_graph(tf_model_path)
```
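To confirm the export produced a loadable SavedModel before converting, a quick sanity check can be run. This is a minimal sketch; the signature key `serving_default` is the usual default written by onnx-tf, not something confirmed in the original post:

```python
import tensorflow as tf

# Load the exported SavedModel and print its serving signature so the
# expected input name, shape and dtype are known before conversion.
loaded = tf.saved_model.load("./onnx_resnet/tf_2.2.0/tflite/saved_model")
infer = loaded.signatures["serving_default"]
print(infer.structured_input_signature)
print(infer.structured_outputs)
```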
pb -> tflite
```python
import os

import cv2
import numpy as np
import tensorflow as tf


def representative_data_gen():
    a = []
    datapath = "/MSCOCO/images/test2017"
    file_list = os.listdir(datapath)
    pixel_mean = (0.485, 0.456, 0.406)
    pixel_std = (0.229, 0.224, 0.225)
    # Preprocess the first 100 images (resize to 224x224, normalize).
    for i in range(100):
        img = cv2.imread(os.path.join(datapath, file_list[i]))
        img = cv2.resize(img, (224, 224))
        img = (img - pixel_mean) / pixel_std
        img = img.astype(np.float32)
        a.append(img)
    a = np.array(a)
    img = tf.data.Dataset.from_tensor_slices(a).batch(1)
    # Note: take(1) yields only the first single-image batch for calibration.
    for i in img.take(1):
        yield [i]


tf_model_path = "./onnx_resnet/tf_2.2.0/tflite/saved_model"
tflite_model_path = "./tensorflow_resnet/tf.2.2.0/tflite/resnet_int8.tflite"

converter = tf.lite.TFLiteConverter.from_saved_model(tf_model_path)
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.representative_dataset = representative_data_gen
tflite_model = converter.convert()

with open(tflite_model_path, 'wb') as f:
    f.write(tflite_model)
```
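For comparison, the standard full-integer post-training quantization recipe in the TensorFlow Lite docs also sets `converter.optimizations`, and on newer releases (roughly TF 2.3+) the model's input/output tensors can be forced to int8 as well. This is a minimal sketch rather than the exact configuration used above:

```python
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model(
    "./onnx_resnet/tf_2.2.0/tflite/saved_model")
# Enable post-training quantization and restrict ops to the int8 kernel set.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.representative_dataset = representative_data_gen
# Optionally force int8 input/output tensors (supported from about TF 2.3).
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()
```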
3. Failure after conversion
TFLite Test Code

```python
import tensorflow as tf

interpreter = tf.lite.Interpreter(
    model_path="./tensorflow_resnet/tf.2.2.0/tflite/resnet_int8.tflite")
interpreter.allocate_tensors()
```
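For completeness, once `allocate_tensors()` succeeds the model can be exercised with a dummy input as below. This is a sketch only; the actual input layout and dtype depend on the exported model and are not confirmed in this issue:

```python
import numpy as np
import tensorflow as tf

interpreter = tf.lite.Interpreter(
    model_path="./tensorflow_resnet/tf.2.2.0/tflite/resnet_int8.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
print(input_details)  # shows the expected input shape and dtype

# Feed a dummy tensor matching the reported input signature.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()
print(interpreter.get_tensor(output_details[0]["index"]).shape)
```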
Error Log
RuntimeError: tensorflow/lite/kernels/pad.cc:106 op_context.input->type != op_context.constant_values->type (9 != 1)Node number 7 (PADV2) failed to prepare.
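For reference, the numeric type codes in that message come from the TfLiteType enum in the TFLite C API, so the check is reporting an int8 vs. float32 mismatch on the PADV2 node:

```python
# TfLiteType codes referenced in the error:
#   9 -> kTfLiteInt8     (op_context.input)
#   1 -> kTfLiteFloat32  (op_context.constant_values)
# i.e. the pad's constant_values tensor stayed float32 while its data
# input was quantized to int8, so the PADV2 kernel refuses to prepare.
```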
@JunHyunjae Could you please try the latest stable version, TF 2.6.0, and let us know if the issue still persists? Thank you!
@sushreebarsa Should I do this only on TF 2.6.0? Because of a chip dependency, only TFLite 2.4.x can be used. Is TF 2.6.0 compatible with TFLite 2.4.x?
Could you post the TFLite model? It appears the pad op was not converted with the correct type constraints.
Could you also try adding the flag

```python
converter.experimental_new_quantizer = False
```
Thanks