mmdeploy
[Bug] Could not load the library of tensorrt plugins. & Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [X] 2. I have read the FAQ documentation but cannot get the expected help.
- [ ] 3. The bug has not been fixed in the latest version.
Describe the bug
```
04/23 11:49:15 - mmengine - WARNING - Could not load the library of tensorrt plugins. Because the file does not exist:
[04/23/2024-11:49:17] [TRT] [E] 1: [pluginV2Runner.cpp::load::290] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
[04/23/2024-11:49:17] [TRT] [E] 4: [runtime.cpp::deserializeCudaEngine::50] Error Code 4: Internal Error (Engine deserialization failed.)
```
I can convert the model to ONNX and to a TensorRT engine with the deploy script without errors, but when I run the TensorRT model I get the error messages above, and I don't know why.
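The two messages are likely connected: the engine was serialized with mmdeploy's custom TensorRT ops, so deserialization fails unless the plugin library has been loaded into the process first, and the warning at the top says mmdeploy could not find that library on disk. Below is a minimal sketch of that loading step, assuming the library follows mmdeploy's usual naming (`libmmdeploy_tensorrt_ops.so`); the candidate paths are illustrative, not taken from this report:

```python
import ctypes
import os


def find_plugin_lib(candidates):
    """Return the first path in `candidates` that exists on disk, or None."""
    for path in candidates:
        if path and os.path.isfile(path):
            return path
    return None


def load_mmdeploy_trt_plugins(candidates):
    """Load the custom-ops library with ctypes so its IPluginCreator
    objects register themselves in TensorRT's plugin registry before
    deserializeCudaEngine runs."""
    lib_path = find_plugin_lib(candidates)
    if lib_path is None:
        raise FileNotFoundError(
            'mmdeploy TensorRT ops library not found; build mmdeploy with '
            'TensorRT backend support so the custom plugins are compiled')
    return ctypes.CDLL(lib_path)
```

Calling something like `load_mmdeploy_trt_plugins(['/path/to/mmdeploy/build/lib/libmmdeploy_tensorrt_ops.so'])` before deserializing the engine should make the `IPluginCreator not found` error go away; mmdeploy normally does this for you, and the warning indicates its own lookup found no file.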
Reproduction
```python
import mmengine
from mmdeploy.backend.tensorrt import TRTWrapper
import tensorrt as trt
import torch

engine_file = './work_dirs/rtmdet/end2end.engine'
model = TRTWrapper(engine_file)
```
Environment
CUDA 11.3, TensorRT 8.2.3.0, RTX 3090
```shell
# in bashrc
export PATH=/home/home_node7/pxs/Tensorrt/TensorRT-8.2.3.0:$PATH
export LD_LIBRARY_PATH=/home/home_node7/pxs/Tensorrt/TensorRT-8.2.3.0/lib:$LD_LIBRARY_PATH
```
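The "file does not exist" part of the warning usually means the directory containing mmdeploy's compiled TensorRT ops is not visible to the process. If the ops were built from source, appending their build directory to `LD_LIBRARY_PATH` is a common fix; the path below is a placeholder assumption, not taken from this report:

```shell
# placeholder path -- adjust to wherever mmdeploy was cloned and built
export MMDEPLOY_DIR=$HOME/mmdeploy
export LD_LIBRARY_PATH=$MMDEPLOY_DIR/build/lib:$LD_LIBRARY_PATH
```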
Error traceback
No response
I got the same problem. Have you solved it?
I reconfigured TensorRT and that resolved the issue.