Cannot deserialize serialized engine built with EngineCapability::kDLA_STANDALONE: failure of TensorRT 8.5.2 when running YOLOv5 on Jetson AGX Orin Developer Kit
Description
- I cloned the repo https://github.com/NVIDIA-AI-IOT/cuDLA-samples and then used trtexec to run inference with the engine file. The file yolov5.int8.int8hwc4in.fp16chw16out.standalone.bin was generated by following the repo README.
trtexec --loadEngine=yolov5.int8.int8hwc4in.fp16chw16out.standalone.bin --batch=1 --streams=1
error:
[01/19/2024-16:12:34] [E] Error[9]: Cannot deserialize serialized engine built with EngineCapability::kDLA_STANDALONE, use cuDLA APIs instead.
[01/19/2024-16:12:34] [E] Error[4]: [runtime.cpp::deserializeCudaEngine::65] Error Code 4: Internal Error (Engine deserialization failed.)
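For context, this error is expected for that file: building with EngineCapability::kDLA_STANDALONE makes buildSerializedNetwork emit a DLA loadable rather than a TensorRT engine, which is why IRuntime::deserializeCudaEngine (and therefore trtexec --loadEngine) rejects it. Below is a minimal sketch of such a build with the TensorRT 8.5 C++ builder API; it is not the cuDLA-samples export script, and the ONNX path, output file name, and FP16/CHW16 I/O formats are illustrative placeholders (the sample's actual loadable uses INT8 HWC4 input and FP16 CHW16 output, per its file name).

```cpp
// Minimal sketch (not the cuDLA-samples export script): build a DLA standalone
// loadable with the TensorRT 8.5 C++ builder API. The key call is
// setEngineCapability(kDLA_STANDALONE): the serialized blob is a cuDLA
// loadable, not a TensorRT engine, so trtexec/IRuntime cannot deserialize it.
// "yolov5.onnx" and "yolov5.standalone.bin" are placeholder file names.
#include <NvInfer.h>
#include <NvOnnxParser.h>
#include <fstream>
#include <iostream>

using namespace nvinfer1;

class Logger : public ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cout << msg << std::endl;
    }
} gLogger;

int main() {
    auto builder = createInferBuilder(gLogger);
    auto network = builder->createNetworkV2(
        1U << static_cast<uint32_t>(NetworkDefinitionCreationFlag::kEXPLICIT_BATCH));
    auto parser = nvonnxparser::createParser(*network, gLogger);
    if (!parser->parseFromFile("yolov5.onnx", static_cast<int>(ILogger::Severity::kWARNING)))
        return 1;

    auto config = builder->createBuilderConfig();
    config->setDefaultDeviceType(DeviceType::kDLA);    // run the whole network on DLA
    config->setDLACore(0);
    config->setEngineCapability(EngineCapability::kDLA_STANDALONE);
    config->setFlag(BuilderFlag::kFP16);
    config->setFlag(BuilderFlag::kDIRECT_IO);          // standalone requires DLA-native I/O

    // DLA standalone needs explicit, DLA-supported I/O formats; FP16/CHW16 used here for simplicity.
    network->getInput(0)->setType(DataType::kHALF);
    network->getInput(0)->setAllowedFormats(1U << static_cast<uint32_t>(TensorFormat::kCHW16));
    network->getOutput(0)->setType(DataType::kHALF);
    network->getOutput(0)->setAllowedFormats(1U << static_cast<uint32_t>(TensorFormat::kCHW16));

    // In kDLA_STANDALONE mode this IHostMemory holds a DLA loadable for cuDLA, not a TRT engine.
    auto loadable = builder->buildSerializedNetwork(*network, *config);
    if (!loadable) return 1;
    std::ofstream("yolov5.standalone.bin", std::ios::binary)
        .write(static_cast<const char*>(loadable->data()), loadable->size());
    return 0;
}
```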
Environment
TensorRT Version: 8.5.2
NVIDIA GPU: NVIDIA Jetson AGX Orin Developer Kit (integrated GPU)
Software part of jetson-stats 4.2.4 - (c) 2024, Raffaello Bonghi
Model: Jetson AGX Orin Developer Kit - Jetpack 5.1.2 [L4T 35.4.1]
NV Power Mode[0]: MAXN
Serial Number: [XXX Show with: jetson_release -s XXX]
Hardware:
- P-Number: p3701-0005
- Module: NVIDIA Jetson AGX Orin (64GB RAM)
Platform:
- Distribution: Ubuntu 20.04 focal
- Release: 5.10.120-tegra
jtop:
- Version: 4.2.4
- Service: Active
Libraries:
- CUDA: 11.4.315
- cuDNN: 8.6.0.166
- TensorRT: 8.5.2.2
- VPI: 2.3.9
- Vulkan: 1.3.204
- OpenCV: 4.6.0 - with CUDA: YES
The DLA standalone loadable can only be loaded with the cuDLA API; TensorRT cannot load it.
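For reference, here is a minimal sketch of loading such a loadable with the cuDLA runtime API shipped with JetPack (cudla.h / libcudla). The file name is the one from the issue, error handling is abbreviated, and real inference additionally needs cudlaMemRegister on CUDA buffers and cudlaSubmitTask on a stream, as done in the cuDLA-samples repo.

```cpp
// Minimal sketch: load a DLA standalone loadable with the cuDLA runtime API
// (cudla.h from JetPack). Error handling is abbreviated; real inference also
// needs cudlaMemRegister + cudlaSubmitTask as shown in cuDLA-samples.
#include <cudla.h>
#include <cstdint>
#include <cstdio>
#include <cstdlib>
#include <vector>

static std::vector<uint8_t> readFile(const char* path) {
    std::FILE* f = std::fopen(path, "rb");
    if (!f) { std::perror("fopen"); std::exit(1); }
    std::fseek(f, 0, SEEK_END);
    long size = std::ftell(f);
    std::fseek(f, 0, SEEK_SET);
    std::vector<uint8_t> buf(static_cast<size_t>(size));
    std::fread(buf.data(), 1, buf.size(), f);
    std::fclose(f);
    return buf;
}

int main(int argc, char** argv) {
    const char* path = argc > 1 ? argv[1]
        : "yolov5.int8.int8hwc4in.fp16chw16out.standalone.bin";
    std::vector<uint8_t> loadable = readFile(path);

    cudlaDevHandle dev = nullptr;
    // DLA core 0, hybrid mode (tasks are later submitted on a CUDA stream).
    cudlaStatus st = cudlaCreateDevice(0, &dev, CUDLA_CUDA_DLA);
    if (st != cudlaSuccess) { std::printf("cudlaCreateDevice failed: %d\n", (int)st); return 1; }

    cudlaModule module = nullptr;
    st = cudlaModuleLoadFromMemory(dev, loadable.data(), loadable.size(), &module, 0);
    if (st != cudlaSuccess) { std::printf("cudlaModuleLoadFromMemory failed: %d\n", (int)st); return 1; }

    // Query tensor counts to confirm the loadable was parsed.
    cudlaModuleAttribute attr{};
    cudlaModuleGetAttributes(module, CUDLA_NUM_INPUT_TENSORS, &attr);
    std::printf("input tensors:  %u\n", attr.numInputTensors);
    cudlaModuleGetAttributes(module, CUDLA_NUM_OUTPUT_TENSORS, &attr);
    std::printf("output tensors: %u\n", attr.numOutputTensors);

    cudlaModuleUnload(module, 0);
    cudlaDestroyDevice(dev);
    return 0;
}
```

On the Orin this should compile with something like `g++ load_standalone.cpp -I/usr/local/cuda/include -L/usr/local/cuda/lib64 -lcudla` (paths assumed for a default JetPack install).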
Is there any plan to support loading the DLA standalone loadable in TensorRT?
AFAIK no.
Closing since no activity for more than 3 weeks, thanks all!