FastDeploy
Problems deploying FastDeploy from C#
Environment
- [FastDeploy version]: fastdeploy-win-x64-gpu-1.0.6
- [Build command]: CMake 3.24.1
- [System platform]: Windows x64 (Windows 10)
- [Hardware]: NVIDIA GeForce RTX 3060 Laptop GPU
- [Environment]: CUDA 11.2, cuDNN 11.3, TensorRT-8.4.1.5
- [Language]: C#
The CMake configuration at build time was set as follows.
WITH_CAPI was left unchecked for the first successful build and checked for the second successful build. After building successfully, the following test code was used.
With the first build (WITH_CAPI unchecked), the test fails with "unable to find an entry point named FD_C_CreateRuntimeOptionWrapper". With the second build (WITH_CAPI checked), the test fails with either "an attempt was made to load a program with an incorrect format" or "unable to load fastdeploy.dll". I have already checked: every project is set to x64, the C# projects are built in Release, the target framework is .NET Framework 4.7 or 4.8, and the OpenCvSharp4 NuGet package is the latest 4.7.0.20230115. I have not found a solution yet.
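A minimal diagnostic sketch (not from the original post) that can separate these failure modes: it loads fastdeploy.dll directly through the Win32 loader and checks whether the C-API entry point named in the error is actually exported. The only FastDeploy symbol it uses is FD_C_CreateRuntimeOptionWrapper, taken from the error message above; everything else is plain kernel32 P/Invoke.

```csharp
using System;
using System.Runtime.InteropServices;

static class FdCapiCheck
{
    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Unicode)]
    static extern IntPtr LoadLibrary(string path);

    [DllImport("kernel32", SetLastError = true, CharSet = CharSet.Ansi)]
    static extern IntPtr GetProcAddress(IntPtr module, string name);

    static void Main()
    {
        // Must be the x64 fastdeploy.dll from the GPU SDK, reachable from the working directory or PATH.
        IntPtr dll = LoadLibrary("fastdeploy.dll");
        if (dll == IntPtr.Zero)
        {
            // Error 193 = 32/64-bit mismatch ("incorrect format"); 126 = the DLL or one of its dependencies was not found.
            Console.WriteLine($"LoadLibrary failed, Win32 error {Marshal.GetLastWin32Error()}");
            return;
        }

        IntPtr fn = GetProcAddress(dll, "FD_C_CreateRuntimeOptionWrapper");
        Console.WriteLine(fn == IntPtr.Zero
            ? "Entry point not exported: this fastdeploy.dll was built without WITH_CAPI"
            : "FD_C_CreateRuntimeOptionWrapper is exported: the C API is present");
    }
}
```

If LoadLibrary fails with error 193 even though every project is set to x64, the usual remaining suspect is a 32-bit host process (for example, "Prefer 32-bit" enabled on the C# executable) rather than the native build itself.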
I'm sorry, I don't understand Chinese, but I think I have a similar problem. I'm trying to build and use "fastdeploy_sharp.dll". The build succeeds (Release, x64, Win10), but after starting, at the same code line as in the post above, I get System.DllNotFoundException: Unable to load DLL "fastdeploy.dll".
But fastdeploy.dll is in the project working directory. It was built with CSHARP_API and C_API on.
I tried to build fastdeploy_sharp.dll (and the other FastDeploy libs) from the command line, from VS2019, and from VS2022 for Framework 4.8. I also tried to manually retarget the project to .NET 5 and .NET 6, but I get DllNotFoundException every time.
Am I doing something wrong? Or has something changed in the fastdeploy.dll project?
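One common cause of exactly this DllNotFoundException, even when fastdeploy.dll sits next to the executable, is that one of its own dependencies (the Paddle Inference, ONNX Runtime, OpenCV, CUDA/cuDNN, or TensorRT DLLs shipped with the SDK) cannot be resolved; Windows then reports the top-level DLL as not loadable. Below is a hedged sketch of one workaround, assuming the stock SDK layout where the dependent DLLs live in subfolders of the SDK directory; the sdkRoot path is illustrative and must point at your own fastdeploy-win-x64-gpu-1.0.6 folder.

```csharp
using System;
using System.IO;
using System.Linq;

static class NativePathSetup
{
    // Prepend every directory in the SDK that contains a DLL to this process's PATH,
    // so the Windows loader can resolve fastdeploy.dll's dependencies. Call this
    // before the first P/Invoke into fastdeploy_sharp / fastdeploy.
    public static void AddFastDeployToPath(string sdkRoot)
    {
        var dllDirs = Directory.EnumerateFiles(sdkRoot, "*.dll", SearchOption.AllDirectories)
                               .Select(Path.GetDirectoryName)
                               .Distinct();
        string oldPath = Environment.GetEnvironmentVariable("PATH") ?? "";
        Environment.SetEnvironmentVariable("PATH", string.Join(";", dllDirs) + ";" + oldPath);
    }
}

// Usage, at the very top of Main() (example path, adjust to your machine):
// NativePathSetup.AddFastDeployToPath(@"D:\sdk\fastdeploy-win-x64-gpu-1.0.6");
```

A dependency-walker tool such as Dependencies or `dumpbin /dependents` can confirm which specific DLL is the one actually missing.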
I got to this step too: inference produces no result. Model initialization seems fine, but execution exits at infer and goes no further.
After getting a PaddleSeg prediction result, the program hangs while freeing the result memory that was returned from C++. If I comment out that release call and free the memory in the C# code instead, it runs normally. Since I'm not very familiar with C++, I don't know whether releasing the memory this way causes any problems.
The memory-release code on the C++ side:
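On the question of whether freeing this memory from C# is safe: as a general rule, memory that the C++ side allocated (with its own new/malloc inside the DLL) should be released by that same DLL; freeing it from C# with Marshal.FreeHGlobal or similar assumes an allocator the native code did not use and can corrupt the heap or only appear to work by accident. A minimal, hedged C# sketch of the usual safe pattern follows, assuming the C API exports some release function for the result (the exact FastDeploy function name is not confirmed in this thread): deep-copy the payload into managed memory first, then let the native library free its own allocation.

```csharp
using System;
using System.Runtime.InteropServices;

static class ResultOwnership
{
    // Copy `count` bytes out of the native result buffer into the managed heap,
    // then invoke the native library's own release routine (whatever FD_C_Destroy*-style
    // function your build of the C API exports for this result type; placeholder here).
    public static byte[] CopyThenRelease(IntPtr nativeBuffer, int count, Action releaseNative)
    {
        var managed = new byte[count];
        Marshal.Copy(nativeBuffer, managed, 0, count); // deep copy while the native memory is still valid
        releaseNative();                               // free on the side that allocated it
        return managed;                                // only the managed copy survives
    }
}
```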
Is anyone using TensorRT from C#?
@zhinangubei I ran into the same problem. Did you manage to solve it?
> I got to this step too: inference produces no result. Model initialization seems fine, but execution exits at infer and goes no further.

Did you manage to solve this?
I looked into the code a bit and it is roughly what @zhcco described: the memory release is what goes wrong. With the memory-release code commented out, inference does return data, but by the time it comes back through the interface into C# the data is gone and the C# code receives an empty object. I haven't figured out this step; I'm not very familiar with C++.
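A hedged guess at what produces the empty object: if the native result is released (or its backing memory goes away) before its fields are copied into managed objects, the C# side is left reading a dangling pointer and sees nothing. The struct below is purely illustrative and does not reproduce FastDeploy's real result types; it only shows the required ordering: read the struct, deep-copy the payload, and only then allow the native side to free the allocation.

```csharp
using System;
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Sequential)]
struct NativeResult            // hypothetical shape of a C result struct
{
    public IntPtr data;        // pointer into native memory
    public int length;         // number of float elements
}

static class ResultCopy
{
    public static float[] ToManaged(IntPtr nativeResultPtr)
    {
        // 1) Read the struct header from native memory.
        var result = Marshal.PtrToStructure<NativeResult>(nativeResultPtr);

        // 2) Deep-copy the payload into a managed array.
        var values = new float[result.length];
        Marshal.Copy(result.data, values, 0, result.length);

        // 3) Only after this returns is it safe to let the native side free the result.
        return values;
    }
}
```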