A problem occurred when I tried to use C++ to deploy TensorRT+CUDA under Windows: Device "cuda" not found.
Checklist
- [X] 1. I have searched related issues but cannot get the expected help.
- [X] 2. I have read the FAQ documentation but cannot get the expected help.
- [ ] 3. The bug has not been fixed in the latest version.
Describe the bug
I first followed the tutorial to compile the CUDA+TensorRT SDK and demos, and used a demo to run the corresponding model inference. I configured the files from build/include and build/lib in VS2019. But when I try to customize the demo, I get the error: Device "cuda" not found.
Reproduction
#include &lt;cstdio&gt;
#include &lt;iostream&gt;

#include "mmdeploy/detector.h"

int main() {
  const char* device_name = "cuda";
  // Backslashes in C++ string literals must be escaped on Windows.
  const char* model_path = "D:\\Users\\learn\\biyesheji\\deploytensorrt\\mmdeploy\\work";
  const char* image_path = "D:\\Users\\learn\\biyesheji\\deploytensorrt\\mmdetection\\demo\\demo.jpg";
  mmdeploy_detector_t detector{};
  int status = mmdeploy_detector_create_by_path(model_path, device_name, 0, &detector);
  if (status != MMDEPLOY_SUCCESS) {
    fprintf(stderr, "failed to create detector, code: %d\n", (int)status);
    return 1;
  }
  std::cout << "Hello World!\n";
  mmdeploy_detector_destroy(detector);
}
Environment
My environment is Windows 10 + VS2019 + CUDA 11.1 + cuDNN 8.6.0
Error traceback
[2024-05-14 17:15:36.662] [mmdeploy] [info] [model.cpp:35] [DirectoryModel] Load model: "D:\Users\learn\biyesheji\deploytensorrt\mmdeploy\work"
[2024-05-14 17:15:36.664] [mmdeploy] [error] [common.cpp:67] Device "cuda" not found
failed to create detector, code: 1
I also encountered a similar problem. Have you solved it?
Same problem.
Same problem. Have you solved it?
Same problem here. Has it been solved?