
converted tensorrt engine causes Segmentation fault

Open WIll-Xu35 opened this issue 4 years ago • 3 comments

Hi all,

I've trained my own ctdet_dla_34 model with 10 object classes. Training was successful and PyTorch inference works fine.

I followed the instructions in this repo to generate an ONNX model, but the exported model cannot be loaded by onnxruntime and raises the following error:

onnxruntime.capi.onnxruntime_pybind11_state.InvalidGraph: [ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. Error in Node: : No Op registered for DCNv2 with domain_version of 9
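For reference, a quick way to confirm the failure comes from the custom DCNv2 op (which stock onnxruntime has no kernel for) is to list the op types in the exported graph. A minimal sketch; the model path is just a placeholder:

```python
import onnx
from collections import Counter

# Count the op types in the exported graph; a non-zero count for "DCNv2"
# confirms the model contains the custom op that stock onnxruntime cannot run.
model = onnx.load("ctdet_dla_34.onnx")  # placeholder path
print(Counter(node.op_type for node in model.graph.node))
```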

I ignored this error and went on to convert the model to a TensorRT engine, and the engine build finished without errors. But when I try to load the engine with the following code:

    with open('test.engine', 'rb') as f, trt.Runtime(TRT_LOGGER) as runtime:
        engine = runtime.deserialize_cuda_engine(f.read())

it fails with: Segmentation fault (core dumped). That is the only output from the execution.
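One thing worth checking: TensorRT needs the plugin creator for custom layers such as DCNv2 to be registered before an engine containing them is deserialized, and on older versions a missing plugin can surface as a crash rather than a clean error. Below is a sketch of loading the plugin library before deserialization, assuming the DCNv2 plugin was built as a shared library (the .so name is a placeholder):

```python
import ctypes
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.INFO)

# Load the custom DCNv2 plugin library so its creator is registered with
# TensorRT before the engine is deserialized (placeholder .so name), then
# register the stock plugins shipped with TensorRT.
ctypes.CDLL("libdcn_plugin.so", mode=ctypes.RTLD_GLOBAL)
trt.init_libnvinfer_plugins(TRT_LOGGER, "")

with open("test.engine", "rb") as f, trt.Runtime(TRT_LOGGER) as runtime:
    engine = runtime.deserialize_cuda_engine(f.read())
```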

My environment is PyTorch 1.0, Ubuntu 16.04, TensorRT 5.0.2, onnx-tensorrt v5.0, and CUDA 9.0, all inside a Docker container.

Any idea what might be wrong and how to solve this problem?

Much appreciated.

WIll-Xu35 · Apr 30 '20

@WIll-Xu35 Hi, did you solve it?

Jumponthemoon · Jun 24 '20

@qianchenghao Nope, I used dlav0 instead

WIll-Xu35 · Jun 24 '20


OK, I'll give it a try. Thanks!

Jumponthemoon · Jun 27 '20