CenterFace

deserialize_cuda_engine error, I need help

Open Samonsix opened this issue 4 years ago • 7 comments

Python==3.6.8, TensorRT==5.1, Ubuntu 18.04, CUDA device: GTX 1080Ti

In /prj-tensorrt/centerface.py, this line fails:

self.net = runtime.deserialize_cuda_engine(f.read())

python: engine.cpp:1104: bool nvinfer1::rt::Engine::deserialize(const void*, std::size_t, nvinfer1::IGpuAllocator&, nvinfer1::IPluginFactory*): Assertion `size >= bsize && "Mismatch between allocated memory size and expected size of serialized engine."' failed.

Samonsix avatar Mar 28 '20 03:03 Samonsix

Hi, maybe your computer environment doesn't match the author's: the serialized engine file depends on the CUDA version, GPU driver, GPU model, and so on.

onnx20 avatar Mar 28 '20 06:03 onnx20

Hi @oyrq,

Do you know how I can rebuild the CUDA engine file on my computer?

do-van-long avatar Apr 12 '20 12:04 do-van-long

> Hi @oyrq,
>
> Do you know how I can rebuild the CUDA engine file on my computer?

You can't reuse the provided engine file. You need the source model file (e.g. a TensorFlow, Caffe, or ONNX model) to rebuild the TensorRT engine, or you need to recreate an environment that matches the author's.

onnx20 avatar Apr 12 '20 13:04 onnx20

You can install TensorRT, convert the ONNX model with the trtexec command, and save the resulting engine.
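For reference, the conversion step above could look like this (a minimal sketch; the exact flags vary between TensorRT versions, and the file names are assumptions):

```shell
# Convert the ONNX model to a serialized TensorRT engine and save it.
# --onnx / --saveEngine are standard trtexec options; paths are examples.
trtexec --onnx=centerface.onnx --saveEngine=centerface.trt
```

The saved centerface.trt can then be loaded with runtime.deserialize_cuda_engine, but only on a machine with the same GPU, driver, CUDA, and TensorRT versions as the one where it was built.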

austingg avatar Apr 30 '20 07:04 austingg

Hello @austingg,

I installed TensorRT and converted centerface.onnx to centerface.trt. However, I could not run inference with TensorRT; the problem is an input dimension mismatch. I used Netron to view the original centerface.onnx file and got an input shape of (10x3x32x32). I know the input HxW should be a multiple of 32, but what I don't understand is why the ONNX model's input shape is (10x3x32x32). Inference with OpenCV is OK, but not with TensorRT. Do you know what the problem is with the provided centerface.onnx file? For your information, I found another GitHub repo with a centerface.onnx whose input shape is 1x3x1056x1920, and with that one I could build the engine and run inference with TensorRT successfully.
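As an aside, the multiple-of-32 constraint mentioned above means resize targets are usually rounded up to the nearest multiple of 32 before feeding frames to the network. A minimal stdlib sketch (the helper name is an assumption, not part of the repo):

```python
def align_to_32(h, w):
    """Round height/width up to the nearest multiple of 32."""
    return ((h + 31) // 32) * 32, ((w + 31) // 32) * 32

# A 1080p frame would be padded/resized to 1088x1920.
print(align_to_32(1080, 1920))  # -> (1088, 1920)
```

This matches the 1x3x1056x1920 model mentioned above: both 1056 and 1920 are already multiples of 32.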

do-van-long avatar Apr 30 '20 15:04 do-van-long


Hi, did you run this with the DeepStream SDK? I am having an issue there with parsing the centerface.onnx model; the dims are not correct.

rahulsharma11 avatar Feb 22 '21 11:02 rahulsharma11