onnx-tensorrt

saving trt using python backend

Open: Sheepy0 opened this issue 4 years ago • 1 comment

Hello, I am trying to create a trt model using the Python Backend.

I am able to create the engine using the code below:

```python
import onnx
import onnx_tensorrt.backend as backend
import numpy as np

model = onnx.load("/path/to/model.onnx")
engine = backend.prepare(model, device='CUDA:0')
```

How do I save the TRT model from here?
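For reference, a minimal sketch of how a serialized engine ("plan") is usually written to disk with the plain TensorRT Python API, independent of `onnx_tensorrt.backend`. The API shown is TensorRT 8.x (older releases use `builder.build_engine(...)` followed by `engine.serialize()`), and the paths are placeholders:

```python
# Sketch: build an engine from the ONNX file with the TensorRT Python API and
# write the serialized plan to disk. This bypasses onnx_tensorrt.backend.
import tensorrt as trt

TRT_LOGGER = trt.Logger(trt.Logger.WARNING)

builder = trt.Builder(TRT_LOGGER)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
)
parser = trt.OnnxParser(network, TRT_LOGGER)

# Parse the ONNX model; report parser errors if it fails.
with open("/path/to/model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("Failed to parse the ONNX model")

# Build a serialized engine and write it out.
config = builder.create_builder_config()
serialized_engine = builder.build_serialized_network(network, config)

with open("/path/to/model.trt", "wb") as f:
    f.write(serialized_engine)
```

The saved plan can later be reloaded with `trt.Runtime(TRT_LOGGER).deserialize_cuda_engine(...)` instead of rebuilding it from the ONNX file each time.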

Many thanks

Sheepy0 · Apr 23 '21 00:04

I would recommend using trtexec or Polygraphy for finer control over TRT. The Python backend in this repo is not as robust as those tools.
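To illustrate those two routes, a hedged sketch follows (placeholder paths; the trtexec flags and Polygraphy function names reflect recent TensorRT/Polygraphy releases and may differ in older versions):

```python
# With trtexec (ships with TensorRT), building and saving is a single command:
#   trtexec --onnx=/path/to/model.onnx --saveEngine=/path/to/model.trt
#
# Rough equivalent with Polygraphy's Python API (function names assume a
# recent Polygraphy release).
from polygraphy.backend.trt import (
    engine_from_network,
    network_from_onnx_path,
    save_engine,
)

# Parse the ONNX file into a TensorRT network, build an engine, and save it.
engine = engine_from_network(network_from_onnx_path("/path/to/model.onnx"))
save_engine(engine, "/path/to/model.trt")
```

trtexec also exposes builder options such as precision and optimization profiles as command-line flags, which is the finer control mentioned above.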

kevinch-nv · May 03 '21 18:05