
ReduceMax failure of TensorRT 8.5.10 when converting an ONNX file to an engine file with trtexec on GPU Orin

Open · JYS997760473 opened this issue on May 15, 2024 · 7 comments

Description

I tried to generate an engine file from an ONNX file on an Orin GPU, but it failed:

[05/15/2024-11:45:16] [I] [TRT] [MemUsageChange] TensorRT-managed allocation in building engine: CPU +0, GPU +4, now: CPU 0, GPU 4 (MiB)
[05/15/2024-11:45:16] [E] Saving engine to file failed.
[05/15/2024-11:45:16] [E] Engine set up failed
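
The report does not include the exact command that produced this log, so the following is only a sketch of a typical trtexec invocation for this kind of conversion; the model and engine paths are placeholders, not values from the report.

```shell
# Hypothetical trtexec invocation for converting an ONNX model to a TensorRT engine.
# "model.onnx" and "model.engine" are placeholder paths, not taken from the report.
trtexec --onnx=model.onnx --saveEngine=model.engine --verbose
```

Adding --verbose makes trtexec log parsing and building details, which can help show which node (for example, the ReduceMax mentioned in the title) the failure is attributed to.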

Environment

TensorRT Version: 8.5.10

NVIDIA GPU: Orin

NVIDIA Driver Version:

CUDA Version:

CUDNN Version:

Operating System:

Python Version (if applicable):

Tensorflow Version (if applicable):

PyTorch Version (if applicable):

Baremetal or Container (if so, version):

Relevant Files

Model link:

Steps To Reproduce

Commands or scripts:

Have you tried the latest release?:

Can this model run on other frameworks? For example, run the ONNX model with ONNXRuntime (polygraphy run <model.onnx> --onnxrt):
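
For the last question, a minimal sketch of the suggested ONNX Runtime check, assuming the model file is named model.onnx (a placeholder, since the model link is not provided above):

```shell
# Run the ONNX model under ONNX Runtime only, to confirm the model itself loads and runs:
polygraphy run model.onnx --onnxrt

# Optionally run both ONNX Runtime and TensorRT and compare their outputs:
polygraphy run model.onnx --onnxrt --trt
```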

JYS997760473 · May 15 '24 05:05