tkDNN
Using Engine from PythonAPI
Hi, I would like to use your YOLOv4 engine from the TensorRT Python API. I ran into this problem a while ago, but I have not been able to resolve it.
trtexec should be able to deserialize the engine, but it fails because of "PluginFactory": there is no implementation of IPluginCreator or IPluginV2; instead, the custom layers extend the legacy IPlugin interface, which is not compatible with more recent TensorRT versions. That is not a problem if you run the repository directly, but it is if you want to load the engine from Triton Inference Server or trtexec (running trtexec is a preliminary step to make an engine compatible with Triton Inference Server).
Attaching the terminal output:
[02/12/2021-19:09:26] [I] Loading supplied plugin library: ./libkernels.so
[02/12/2021-19:09:27] [E] [TRT] deserializationUtils.cpp (635) - Serialization Error in load: 0 (Serialized engine contains plugin, but no plugin factory was provided. To deserialize an engine without a factory, please use IPluginV2 instead.)
[02/12/2021-19:09:27] [E] [TRT] INVALID_STATE: std::exception
[02/12/2021-19:09:27] [E] [TRT] INVALID_CONFIG: Deserialize the cuda engine failed.
[02/12/2021-19:09:27] [E] Engine creation failed
[02/12/2021-19:09:27] [E] Engine set up failed
&&&& FAILED TensorRT.trtexec # trtexec_debug --loadEngine=yolo4_fp16.rt --plugins=./libkernels.so
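For reference, the same failure reproduces from the TensorRT Python API. The sketch below shows the standard deserialization flow that works for IPluginV2-based engines (the `load_engine` helper, the deferred import, and the file paths are illustrative, not part of tkDNN); with a tkDNN engine built on the legacy IPlugin interface, `deserialize_cuda_engine` fails with the same serialization error as trtexec above.

```python
import ctypes


def load_engine(engine_path: str, plugin_lib: str):
    """Attempt to deserialize a serialized TensorRT engine from Python.

    NOTE: this only succeeds when the engine's custom layers implement
    IPluginV2 (and are registered via IPluginCreator). tkDNN's custom
    layers extend the legacy IPlugin interface, so this call fails the
    same way trtexec does.
    """
    # Deferred import so this sketch can be inspected without TensorRT installed.
    import tensorrt as trt

    # Load the custom-layer library first so its symbols resolve,
    # analogous to trtexec's --plugins=./libkernels.so.
    ctypes.CDLL(plugin_lib)

    logger = trt.Logger(trt.Logger.INFO)
    # Registers the built-in IPluginV2 creators with the plugin registry;
    # it does nothing for legacy IPlugin layers.
    trt.init_libnvinfer_plugins(logger, "")

    with open(engine_path, "rb") as f, trt.Runtime(logger) as runtime:
        # Returns None (or raises, depending on TensorRT version) for
        # engines that were serialized with a PluginFactory.
        return runtime.deserialize_cuda_engine(f.read())
```

This is exactly the flow Triton's TensorRT backend performs internally, which is why an engine that trtexec cannot deserialize will not load in Triton either.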
Were you able to solve this issue? @IsraelLencina
No, I've tried everything, but trtexec does not work with that engine. If you need a YOLOv4 engine that is compatible with trtexec, go to: jkjung-avt
I've also written a wrapper around this repository to access it from Python, but that isn't compatible with trtexec either. The custom layers seem to be legacy code; I haven't been able to solve the problem because "PluginFactory" is used to "mount" the CNN.
@IsraelLencina Thanks for your reply. But I am currently trying to boost my performance using tkDNN; I am already using TensorRT for my inference.