
[Windows GPU] Enabling PaddleDevice.TensorRt throws "PreconditionNotMetError: Please compile with TENSORRT first"

Open EndSmile opened this issue 5 months ago • 5 comments

Describe the bug

Problem description

On Windows 11 (CUDA 12.9 / cuDNN 9.1.0), using the NuGet package
Sdcb.PaddleInference.runtime.win64.cu129_cudnn910_sm120, calling:

var device = PaddleDevice.Gpu()
              .And(PaddleDevice.TensorRt("shape.txt"));
using var ocr = new PaddleOcrAll(model, device);

Log output:

WARNING: Logging before InitGoogleLogging() is written to STDERR
I0621 00:47:19.582204 25144 analysis_config.cc:1475] In CollectShapeInfo mode, we will disable optimizations and collect the shape information of all intermediate tensors in the compute graph and calculate the min_shape, max_shape and opt_shape.


--------------------------------------
C++ Traceback (most recent call last):
--------------------------------------
Not support stack backtrace yet.

----------------------
Error Message Summary:
----------------------
PreconditionNotMetError: To use Paddle-TensorRT, please compile with TENSORRT first. (at D:\a\PaddleSharp\PaddleSharp\paddle-src\paddle\fluid\inference\api\analysis_config.cc:787)

Unhandled exception. System.Runtime.InteropServices.SEHException (0x80004005): External component has thrown an exception.
   at Sdcb.PaddleInference.Native.PaddleNative.PD_ConfigEnableTensorRtEngine(IntPtr pd_config, Int64 workspace_size, Int32 max_batch_size, Int32 min_subgraph_size, PaddlePrecision precision, SByte use_static, SByte use_calib_mode)
   at Sdcb.PaddleInference.PaddleConfig.EnableTensorRtEngine(Int32 workspaceSize, Int32 maxBatchSize, Int32 minSubgraphSize, PaddlePrecision precision, Boolean useStatic, Boolean useCalibMode)
   at Sdcb.PaddleInference.PaddleDevice.<>c__DisplayClass2_0.<TensorRt>b__0(PaddleConfig cfg)
   at Sdcb.PaddleInference.PaddleConfigureExtensions.<>c__DisplayClass0_0.<And>b__0(PaddleConfig cfg)
   at Sdcb.PaddleInference.PaddleConfigureExtensions.<>c__DisplayClass0_0.<And>b__0(PaddleConfig cfg)
   at Sdcb.PaddleInference.PaddleConfig.Apply(Action`1 configure)
   at Sdcb.PaddleOCR.OcrBaseModel.ConfigureDevice(PaddleConfig config, Action`1 configure)
   at Sdcb.PaddleOCR.PaddleOcrDetector..ctor(DetectionModel model, Action`1 configure)
   at Sdcb.PaddleOCR.PaddleOcrAll..ctor(FullOcrModel model, Action`1 device)
   at Program.Main() in D:\project\OcrServer\Program.cs:line 21
   at Program.<Main>()

Steps to reproduce the bug

Windows 11 22H2 + GeForce RTX 5070 Ti

Installed driver 551.xx, CUDA 12.9, cuDNN 9.1.0, and TensorRT 10.12.0 (DLLs added to PATH)

dotnet add package Sdcb.PaddleInference.runtime.win64.cu129_cudnn910_sm120

Chained PaddleDevice.Gpu().And(PaddleDevice.TensorRt(...)) in code

Ran dotnet run; the error is thrown immediately

Expected behavior

No response

Screenshots

No response

Release version

No response

IDE

No response

OS version

No response

Additional context

No response

EndSmile avatar Jun 20 '25 16:06 EndSmile

TensorRT is not supported right now; if you need it, use an older version. Even if it were supported, that's not how you would use it: when running OCR with TensorRT, the three models all have different shapes, and you only specified one shape file.

sdcb avatar Jun 20 '25 23:06 sdcb

I'll just use the GPU version for now then, thanks.

EndSmile avatar Jun 21 '25 03:06 EndSmile
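For reference, the GPU-only workaround mentioned above amounts to dropping the `.And(PaddleDevice.TensorRt(...))` chain so the TensorRT code path in the native library is never reached. A minimal sketch, assuming the same `FullOcrModel` setup as in the report (the model-loading line is a placeholder to adapt to your own model source):

```csharp
using Sdcb.PaddleInference;
using Sdcb.PaddleOCR;
using Sdcb.PaddleOCR.Models;

// Sketch of the GPU-only fallback: PaddleDevice.Gpu() alone, without
// the TensorRt call that triggers PreconditionNotMetError on runtimes
// built without WITH_TENSORRT.
FullOcrModel model = /* load your model as before */;
using PaddleOcrAll ocr = new PaddleOcrAll(model, PaddleDevice.Gpu());
// ocr.Run(...) can then be used as usual.
```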

TensorRT is not supported right now; if you need it, use an older version. Even if it were supported, that's not how you would use it: when running OCR with TensorRT, the three models all have different shapes, and you only specified one shape file.

I'm not very good at English, and after reading the sample I still don't understand where det.txt and rec.txt are supposed to come from. Will they be generated automatically? Do you have any detailed sample code or more detailed instructions for using TensorRT? I installed the correct Paddle version for CUDA 11.8 / cuDNN 8.9 (which supports TensorRT, according to PaddleOCR), and placed all the CUDA, cuDNN, cuBLAS, and TensorRT DLLs in the application folder. .NET 9, Windows 11. Thanks...

andreyk6617 avatar Aug 25 '25 20:08 andreyk6617
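On the shape-file question: the log in the original report ("In CollectShapeInfo mode, we will ... collect the shape information ... and calculate the min_shape, max_shape and opt_shape") suggests the file paths passed to PaddleDevice.TensorRt are simply where Paddle writes the collected shape ranges on a first run, and reads them back afterwards; the names det.txt/rec.txt are arbitrary. A hedged sketch of sdcb's point that each of the three models needs its own file, assuming a native runtime actually built with WITH_TENSORRT; the constructor signature below is taken from the stack trace in this issue, while `model.DetectionModel` and the way the three configured components are recombined into a pipeline are assumptions to check against the PaddleSharp API:

```csharp
using Sdcb.PaddleInference;
using Sdcb.PaddleOCR;

// One shape-info file PER model: detector, classifier and recognizer
// have different tensor shapes, so sharing one file is wrong.
// Paddle fills each file in during a first shape-collection run.
var detDevice = PaddleDevice.Gpu().And(PaddleDevice.TensorRt("det_shape.txt"));
var clsDevice = PaddleDevice.Gpu().And(PaddleDevice.TensorRt("cls_shape.txt"));
var recDevice = PaddleDevice.Gpu().And(PaddleDevice.TensorRt("rec_shape.txt"));

// PaddleOcrDetector(DetectionModel, Action<PaddleConfig>) appears in the
// stack trace above; accessing the sub-model via model.DetectionModel is
// an assumption — verify the property name in your PaddleSharp version.
PaddleOcrDetector detector = new PaddleOcrDetector(model.DetectionModel, detDevice);
```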

I believe the error indicates that the paddle_inference_c.dll shipped with the current CUDA NuGet packages was built without the WITH_TENSORRT option.

@sdcb is it possible this happened? I'm looking into how to build paddle myself right now

ThomasTheGerman avatar Sep 12 '25 13:09 ThomasTheGerman

This version wasn't compiled with WITH_TENSORRT; you can compile it yourself.

sdcb avatar Sep 15 '25 01:09 sdcb
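For anyone attempting the self-build: a rough sketch of configuring a Paddle inference build with TensorRT enabled. WITH_TENSORRT is the flag named in this thread; WITH_GPU, ON_INFER and TENSORRT_ROOT are standard PaddlePaddle CMake options, but the exact paths and the full option set needed for a working paddle_inference_c.dll should be taken from PaddlePaddle's compile-from-source documentation, not from this sketch:

```shell
# Hedged sketch only; paths are examples, adjust to your machine.
git clone https://github.com/PaddlePaddle/Paddle.git
cd Paddle && mkdir build && cd build
cmake .. \
    -DWITH_GPU=ON \
    -DWITH_TENSORRT=ON \
    -DTENSORRT_ROOT="C:/TensorRT-10.12.0" \
    -DON_INFER=ON \
    -DWITH_PYTHON=OFF
cmake --build . --config Release
```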