
Loading the file to build the model failed

AarenWu opened this issue 3 years ago · 2 comments (status: Open)

When I load the model using the TensorRT model file generated by TF-TRT, the console displays the following information:

2022-07-13 16:42:54.914735: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger coreReadArchive.cpp (38) - Serialization Error in verifyHeader: 0 (Version tag does not match)
2022-07-13 16:42:54.915035: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger INVALID_STATE: std::exception
2022-07-13 16:42:54.915065: E tensorflow/compiler/tf2tensorrt/utils/trt_logger.cc:42] DefaultLogger INVALID_CONFIG: Deserialize the cuda engine failed.
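The "Version tag does not match" message from verifyHeader is what TensorRT reports when a serialized engine built with one TensorRT version is deserialized by a different one. As a first sanity check (a minimal sketch; attributing the error to a version mismatch is an assumption, but the calls below are standard), you can print the versions on the machine that loads the model:

```python
# Sketch: print the TensorFlow and TensorRT versions on the loading machine.
# A serialized TRT engine can only be deserialized by a compatible TensorRT runtime.
import tensorflow as tf
print("TensorFlow:", tf.__version__)

import tensorrt  # TensorRT Python bindings shipped with the TensorRT install
print("TensorRT runtime:", tensorrt.__version__)
```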

My system environment is as follows:

CentOS = 7.9.2009
Python = 3.7.6
TensorFlow = 2.4
NVIDIA Driver = 495.29.05
CUDA = 11.0
cuDNN = 8.0
TensorRT = 7.2.1

Regarding the TensorRT model file: I generated it with tf.experimental.tensorrt.Converter(), and when I checked the assets folder under the save path I found many trt-serialized-engine.TRTEngineOp files, which were not present when I previously saved the model with tf.saved_model.save().

I am not sure whether my TensorRT model file was generated correctly. Could you provide a way to verify this?
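One rough verification (a sketch under assumed paths and input shapes, not an official check) is to load the converted SavedModel in the same environment that produced it and run a single inference through the serving signature; if the TRTEngineOp nodes deserialize and execute without the errors shown above, the converted model is at least loadable:

```python
# Minimal sketch: load the TF-TRT SavedModel and run one inference.
# The directory and input shape below are placeholders, not the author's values.
import numpy as np
import tensorflow as tf

saved_model_dir = "/path/to/tftrt_saved_model"  # hypothetical path
loaded = tf.saved_model.load(saved_model_dir)
infer = loaded.signatures["serving_default"]

dummy = tf.constant(np.random.rand(1, 224, 224, 3).astype(np.float32))  # assumed input shape
print(infer(dummy))
```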

Below is the code of my conversion process (attached as a WeChat Work screenshot, 企业微信截图_16577045396464; the code is not available as text).
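Since the screenshot is not available as text, the following is only a generic sketch of how tf.experimental.tensorrt.Converter is typically used in TF 2.x, not the author's actual code; paths, precision mode, and input shape are placeholders. Calling converter.build() before saving is typically what produces the pre-built trt-serialized-engine.TRTEngineOp files under assets mentioned above:

```python
# Generic TF-TRT conversion sketch (NOT the code from the screenshot).
import numpy as np
import tensorflow as tf

params = tf.experimental.tensorrt.ConversionParams(precision_mode="FP16")  # assumed precision
converter = tf.experimental.tensorrt.Converter(
    input_saved_model_dir="/path/to/original_saved_model",  # hypothetical path
    conversion_params=params)
converter.convert()

# Optional: pre-build engines so they are serialized into the assets folder.
def input_fn():
    yield (np.random.rand(1, 224, 224, 3).astype(np.float32),)  # assumed input shape

converter.build(input_fn=input_fn)
converter.save("/path/to/tftrt_saved_model")  # hypothetical output path
```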

AarenWu avatar Jul 13 '22 09:07 AarenWu

Additional info: GPU = NVIDIA GA100

AarenWu avatar Jul 13 '22 09:07 AarenWu

Hi @AarenWu, TF 2.4 is almost 2 years old. Can you please check whether this issue still exists in 2.9 or ToT (tip of tree)?

ncomly-nvidia avatar Jul 18 '22 15:07 ncomly-nvidia