Assertion engine failed
Description
Using trtexec to convert an ONNX model to a TensorRT engine failed, but there is no further error information. How can I solve this?
[02/20/2024-10:56:21] [E] Error[2]: Assertion engine failed.
[02/20/2024-10:56:21] [E] Error[2]: [refitUtils.cpp::buildRefitEngine::71] Error Code 2: Internal Error (Assertion engine failed. )
[02/20/2024-10:56:21] [E] Engine could not be created from network
[02/20/2024-10:56:21] [E] Building engine failed
[02/20/2024-10:56:21] [E] Failed to create engine from model or file.
[02/20/2024-10:56:21] [E] Engine set up failed
Environment
TensorRT Version: 8.6.1
NVIDIA GPU: A30
NVIDIA Driver Version: 525.85.12
CUDA Version: 12.1
CUDNN Version:
Could you provide a minimal onnx model to repro the issue?
Hi @BowenFu, is there an update on this issue? We are facing a similar issue.
Could you provide a minimal onnx model for us to reproduce the issue?
My problem has been solved. It was caused by insufficient GPU memory. When I switched the build from FP32 to FP16, the problem no longer occurred. I am curious why it did not report an OOM error.
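For reference, a minimal sketch of the kind of trtexec invocation that avoids the failure by building in FP16 (model.onnx and model.engine are placeholder names, substitute your own paths):

trtexec --onnx=model.onnx --saveEngine=model.engine --fp16

Building with FP16 enabled roughly halves the memory needed for weights and intermediate buffers compared with FP32, which is likely why the engine build no longer exhausted GPU memory on the A30.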