
Error loading model with python

Open elisabetta496 opened this issue 4 years ago • 11 comments

Hi, I tested your C++ implementation and I would like to implement it in Python. I'm trying to load the engine file but the problem is that the plugin is not found when loading the engine with:

with open(engineFile, "rb") as f, trt.Runtime(G_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())

I get the following error:

[TensorRT] ERROR: INVALID_ARGUMENT: getPluginCreator could not find plugin L2Norm_Helper_TRT version 1
[TensorRT] ERROR: safeDeserializationUtils.cpp (259) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.

Do you know how I could solve this issue? Thanks

elisabetta496 avatar Feb 26 '20 17:02 elisabetta496

Hi, I'm happy to hear that. As stated in the README, this is a custom plugin from https://github.com/r7vme/tensorrt_l2norm_helper. Please check how to load custom TensorRT plugins in Python and let me know if you need more help.

nwesem avatar Feb 26 '20 23:02 nwesem
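For reference, one common way to satisfy the plugin registry in Python is to load the compiled plugin library with ctypes before deserializing the engine. A minimal sketch, assuming the l2norm helper has already been built as a shared library (the library filename and engine path below are placeholders):

```python
import ctypes

import tensorrt as trt

# Load the compiled custom-plugin library so its IPluginCreator
# registers itself with TensorRT's plugin registry.
# NOTE: the filename is a placeholder -- build the library from
# https://github.com/r7vme/tensorrt_l2norm_helper first.
ctypes.CDLL("./libtensorrt_l2norm_helper.so")

G_LOGGER = trt.Logger(trt.Logger.WARNING)
# Also register TensorRT's built-in plugins.
trt.init_libnvinfer_plugins(G_LOGGER, "")

with open("facenet.engine", "rb") as f, trt.Runtime(G_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
```

The key point is that the CDLL call must happen before `deserialize_cuda_engine`, otherwise the `L2Norm_Helper_TRT` creator is still missing from the registry.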


Hi! I want to use uff models in my python script, but I don't know how to load them correctly. Have you solved the problem of loading models in python?

deaffella avatar Mar 24 '20 04:03 deaffella

does this part of TensorRT documentation help you?

nwesem avatar Mar 24 '20 07:03 nwesem

No, it does not. I don't see any instructions to load .uff and .engine models in this documentation.

deaffella avatar Mar 25 '20 00:03 deaffella

No, I'm sorry! I've only managed to load the engine file so far. Have you solved the problem? If not, I'll let you know if I manage to.

elisabetta496 avatar Mar 31 '20 19:03 elisabetta496

Unfortunately, I couldn't find any example of loading models in Python. Please help me.

deaffella avatar Apr 04 '20 14:04 deaffella

Hi @elisabetta496! I have the same problem as @deaffella . Could you please share with us how to convert and save facenet.uff to facenet.engine using Python? Many thanks.

do-van-long avatar May 12 '20 16:05 do-van-long
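In case it helps later readers: with the TensorRT 6/7-era Python API, a .uff model can be parsed and built into an engine roughly as sketched below. The input/output tensor names and the input shape are placeholders and must match the names used when the .uff file was exported:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

def uff_to_engine(uff_path, engine_path):
    """Parse a .uff model and serialize a TensorRT engine to disk."""
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network() as network, \
         trt.UffParser() as parser:
        # Placeholder tensor names/shape -- use the ones from your export.
        parser.register_input("input_1", (3, 160, 160))
        parser.register_output("embeddings")
        if not parser.parse(uff_path, network):
            raise RuntimeError("Failed to parse " + uff_path)
        builder.max_batch_size = 1
        builder.max_workspace_size = 1 << 28  # 256 MiB
        engine = builder.build_cuda_engine(network)
        with open(engine_path, "wb") as f:
            f.write(engine.serialize())
```

Note that the UFF parser is deprecated from TensorRT 7 onward (see the release-notes quote further down this thread), so the ONNX route is the safer long-term choice.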

Hi! I'm still trying to open optimized models in Python. Could you advise me on how to create a .so file for the plugin?

deaffella avatar Aug 04 '20 15:08 deaffella

I'm not sure if this is the way it works. Are you, @deaffella? If so, I could help you compile the project as a dynamic or static library.

nwesem avatar Sep 18 '20 21:09 nwesem
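For anyone attempting this in the meantime: building the plugin as a shared library is typically a single compiler invocation. The source filename and include/library paths below are assumptions; adjust them to the tensorrt_l2norm_helper sources and your CUDA/TensorRT installation:

```shell
# Compile the plugin sources into a shared library that Python can
# load via ctypes.CDLL. Filenames and paths are placeholders.
g++ -shared -fPIC -std=c++11 \
    l2norm_helper.cpp \
    -I/usr/local/cuda/include \
    -L/usr/local/cuda/lib64 -lnvinfer -lcudart \
    -o libtensorrt_l2norm_helper.so
```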

First note this quote from the official TensorRT Release Notes:

Deprecation of Caffe Parser and UFF Parser - We are deprecating Caffe Parser and UFF Parser in TensorRT 7. They will be tested and functional in the next major release of TensorRT 8, but we plan to remove the support in the subsequent major release. Plan to migrate your workflow to use tf2onnx, keras2onnx or TensorFlow-TensorRT (TF-TRT) for deployment.

I have successfully converted Facenet to a TRT engine using ONNX and used it with Python.

I downloaded the facenet_keras.h5 provided in this tutorial, then converted it to a TRT engine via ONNX using this Python tutorial provided by Nvidia.

AnasMK avatar Oct 01 '20 16:10 AnasMK
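The ONNX route described above can be sketched with the TensorRT 7 Python API as follows (paths are placeholders; a tool such as keras2onnx or tf2onnx produces the .onnx file first):

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
EXPLICIT_BATCH = 1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)

def onnx_to_engine(onnx_path, engine_path):
    """Parse an ONNX model and serialize a TensorRT engine to disk."""
    with trt.Builder(TRT_LOGGER) as builder, \
         builder.create_network(EXPLICIT_BATCH) as network, \
         trt.OnnxParser(network, TRT_LOGGER) as parser:
        with open(onnx_path, "rb") as f:
            if not parser.parse(f.read()):
                # Surface parser diagnostics before giving up.
                for i in range(parser.num_errors):
                    print(parser.get_error(i))
                raise RuntimeError("Failed to parse " + onnx_path)
        config = builder.create_builder_config()
        config.max_workspace_size = 1 << 28  # 256 MiB
        engine = builder.build_engine(network, config)
        with open(engine_path, "wb") as f:
            f.write(engine.serialize())
```

Because the ONNX path needs no custom L2Norm plugin, the resulting engine deserializes in Python without any ctypes/plugin-registry steps.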

Check the repo riotu-lab/tf2trt_with_onnx to convert the Facenet model to a TensorRT engine and use it with Python.

AnasMK avatar Oct 08 '20 20:10 AnasMK