CenterFace
deserialize_cuda_engine error, I need help
Python 3.6.8, TensorRT 5.1, Ubuntu 18.04, CUDA device: GTX 1080 Ti
In /prj-tensorrt/centerface.py, the line `self.net = runtime.deserialize_cuda_engine(f.read())` fails with:

```
python: engine.cpp:1104: bool nvinfer1::rt::Engine::deserialize(const void*, std::size_t, nvinfer1::IGpuAllocator&, nvinfer1::IPluginFactory*): Assertion `size >= bsize && "Mismatch between allocated memory size and expected size of serialized engine."' failed.
```
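For context, that call is the standard TensorRT Python deserialization pattern. A minimal self-contained sketch (the engine file name `centerface.trt` is just an example) looks like this:

```python
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(TRT_LOGGER)

# Deserializing only works if the engine was serialized with the same
# TensorRT version, CUDA version, GPU driver, and GPU architecture as
# this machine; otherwise it fails, as in the assertion above.
with open("centerface.trt", "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())
```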
Hi, maybe your computer environment doesn't match the author's: a serialized engine file depends on the CUDA version, GPU driver, GPU model, and so on.
Hi @oyrq,
Do you know how I can rebuild the CUDA engine file on my computer?
You can't reuse it as-is. You need the source model file (e.g., a TensorFlow, Caffe, or ONNX model) to rebuild the TensorRT engine, or you have to set up a computer environment that matches the author's.
You can install TensorRT, convert the ONNX model with the trtexec command, and save the resulting engine.
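As a sketch, the conversion is typically a single command like the one below; the exact flags vary between TensorRT versions (`--saveEngine` is the flag in recent trtexec builds), and the output file name is just an example:

```
trtexec --onnx=centerface.onnx --saveEngine=centerface.trt
```

The saved engine can then be deserialized on the same machine with the Python snippet shown earlier in this thread.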
Hello @austingg,
I installed TensorRT and converted centerface.onnx to centerface.trt. However, I could not run inference with TensorRT because of an input dimension mismatch. When I inspected the original centerface.onnx file with Netron, I saw an input shape of 10x3x32x32. I know the input HxW should be a multiple of 32, but I don't understand why the ONNX model's input shape is 10x3x32x32. Inference with OpenCV works fine, just not with TensorRT. Do you know what the problem with the provided centerface.onnx file is?

For your information, I found another GitHub repository with a centerface.onnx whose input shape is 1x3x1056x1920, and with that model I could build the engine and run inference with TensorRT successfully.
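One workaround, sketched below, is to rewrite the input dimensions of the ONNX file before converting it. This assumes the rest of the graph is shape-agnostic and that 480x640 is the target resolution (both example values, kept as multiples of 32):

```python
import onnx

model = onnx.load("centerface.onnx")

# Overwrite the fixed 10x3x32x32 input with batch 1 and the target
# resolution; height and width must remain multiples of 32.
dims = model.graph.input[0].type.tensor_type.shape.dim
dims[0].dim_value = 1    # batch
dims[2].dim_value = 480  # height (example value)
dims[3].dim_value = 640  # width (example value)

onnx.save(model, "centerface_480x640.onnx")
```

The reshaped model can then be passed to trtexec as shown above.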
Hi, did you run it with the DeepStream SDK? I'm having an issue with DeepStream parsing the centerface.onnx model; the dims are not correct.