tao-toolkit-triton-apps
could not find plugin: BatchTilePlugin_TRT for SSD model
Hi, I generated the model.plan engine file on the same server that runs Triton, and I also built TensorRT OSS, but I get the following error when loading the engine:
E0226 17:02:00.421746 1 logging.cc:43] 3: getPluginCreator could not find plugin: BatchTilePlugin_TRT version: 1
E0226 17:02:00.421934 1 logging.cc:43] 1: [pluginV2Runner.cpp::load::291] Error Code 1: Serialization (Serialization assertion creator failed.Cannot deserialize plugin since corresponding IPluginCreator not found in Plugin Registry)
E0226 17:02:00.421948 1 logging.cc:43] 4: [runtime.cpp::deserializeCudaEngine::75] Error Code 4: Internal Error (Engine deserialization failed.)
E0226 17:02:00.425413 1 model_repository_manager.cc:1215] failed to load 'tao_ssd' version 1: Internal: unable to create TensorRT engine
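This error means the TensorRT runtime inside Triton cannot find the `BatchTilePlugin_TRT` creator in its plugin registry at deserialization time. A common remedy is to make Triton load the `libnvinfer_plugin.so` built from TensorRT OSS before it deserializes the engine. A minimal sketch, assuming the library path and model repository path below are placeholders for your setup:

```shell
# Preload the OSS-built plugin library so its plugin creators
# (including BatchTilePlugin_TRT) register before the engine is
# deserialized. /opt/oss/libnvinfer_plugin.so is a placeholder
# for the library produced by your TensorRT OSS build.
LD_PRELOAD=/opt/oss/libnvinfer_plugin.so \
  tritonserver --model-repository=/models
```

The key requirement is that the preloaded library was built against the same TensorRT version that the Triton container ships with; a version mismatch can fail in equally opaque ways.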
How did you generate the model.plan engine file? Can you elaborate on the steps?
I used the export task in the TAO container to generate the .etlt file. Then I generated the model.plan file following the steps in these links:
https://docs.nvidia.com/tao/tao-toolkit/text/object_detection/ssd.html#tensorrt-open-source-software-oss
https://docs.nvidia.com/tao/tao-toolkit/text/object_detection/ssd.html#generating-an-engine-using-tao-converter
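For reference, the engine-generation step from the second link typically uses tao-converter along these lines. The key, input dimensions, and file names below are illustrative placeholders, not values taken from this thread:

```shell
# Convert the exported .etlt model into a TensorRT engine (model.plan).
# -k : encryption key used during TAO export (placeholder value)
# -d : input dimensions C,H,W for the SSD network (example values)
# -o : output node name for the SSD detection head
# -e : output engine path inside the Triton model repository
tao-converter -k nvidia_tlt \
  -d 3,384,1248 \
  -o NMS \
  -e model_repository/tao_ssd/1/model.plan \
  ssd_model.etlt
```

Note that tao-converter must be run on the same GPU architecture and TensorRT version as the Triton server that will load the engine, since TensorRT engines are not portable across versions or devices.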
@morganh-nv Hi, I think the documentation for generating the TensorRT plugins is incorrect, and the commands do not produce the required plugins.
@morganh-nv Hi, please guide me in this regard.
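For comparison, the TensorRT OSS build that produces libnvinfer_plugin.so (which contains BatchTilePlugin_TRT) generally follows these steps. The branch name and library paths below are assumptions; the branch must match the TensorRT version bundled with your Triton release:

```shell
# Build the open-source TensorRT plugin library.
# release/8.2 is a placeholder branch; pick the one matching your
# Triton container's TensorRT version.
git clone -b release/8.2 https://github.com/NVIDIA/TensorRT.git
cd TensorRT
git submodule update --init --recursive
mkdir -p build && cd build
# TRT_LIB_DIR must point at the installed TensorRT libraries.
cmake .. -DTRT_LIB_DIR=/usr/lib/x86_64-linux-gnu -DTRT_OUT_DIR=$(pwd)/out
make -j"$(nproc)" nvinfer_plugin
# The resulting out/libnvinfer_plugin.so* then replaces (or is
# preloaded ahead of) the stock library in the Triton container.
```

If the build succeeds but the error persists, verify that the replacement library is actually the one being loaded, e.g. with `ldd` on the Triton TensorRT backend or by checking LD_PRELOAD.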
Refer to https://forums.developer.nvidia.com/t/tao-retinanet-triton-server-deployment/215092 and check whether it helps.