mtcnn_facenet_cpp_tensorRT
Error loading model with python
Hi, I tested your C++ implementation and I would like to implement it in Python. I'm trying to load the engine file, but the plugin is not found when the engine is deserialized with:
```python
with open(engineFile, "rb") as f, trt.Runtime(G_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
```
I get the following error:
```
[TensorRT] ERROR: INVALID_ARGUMENT: getPluginCreator could not find plugin L2Norm_Helper_TRT version 1
[TensorRT] ERROR: safeDeserializationUtils.cpp (259) - Serialization Error in load: 0 (Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[TensorRT] ERROR: INVALID_STATE: std::exception
[TensorRT] ERROR: INVALID_CONFIG: Deserialize the cuda engine failed.
```
Do you know how I could solve this issue? Thanks
Hi, I'm happy to hear that. As stated in the README, this is a custom plugin from https://github.com/r7vme/tensorrt_l2norm_helper. Please check how to load custom TensorRT plugins in Python and let me know if you need more help.
Hi! I want to use UFF models in my Python script, but I don't know how to load them correctly. Have you solved the problem of loading models in Python?
does this part of TensorRT documentation help you?
No, it does not. I don't see any instructions for loading .uff and .engine models in this documentation.
No, I'm sorry! I've only loaded the engine file so far. Have you solved the problem? Otherwise, I will let you know if I manage to.
Unfortunately, I couldn't find any example of loading models in Python. Please help me.
Hi @elisabetta496! I have the same problem as @deaffella . Could you please share with us how to convert and save facenet.uff to facenet.engine using Python? Many thanks.
does this part of TensorRT documentation help you?
Hi! I'm still trying to open optimized models in Python. Could you help me with advice on how to create a .so file for the plugin?
I'm not sure that's how it works. Are you @deaffella? If so, I could help you compile the project as a dynamic or static library.
First note this quote from the official TensorRT Release Notes:
Deprecation of Caffe Parser and UFF Parser - We are deprecating Caffe Parser and UFF Parser in TensorRT 7. They will be tested and functional in the next major release of TensorRT 8, but we plan to remove the support in the subsequent major release. Plan to migrate your workflow to use tf2onnx, keras2onnx or TensorFlow-TensorRT (TF-TRT) for deployment.
I have successfully converted Facenet to a TRT engine using ONNX, and used it with Python.
I downloaded the facenet_keras.h5 provided in this tutorial, then converted it to a TRT engine via ONNX using this Python tutorial provided by Nvidia.
Check this repo, riotu-lab/tf2trt_with_onnx, to convert the Facenet model to a TensorRT engine and use it with Python.