Deep-Live-Cam
CUDA_PATH is set but CUDA wasn't able to be loaded [RTX 5070TI]
I'm not sure what I'm doing wrong. I'm running Python 3.10 with CUDA 11.8 and cuDNN 8.5.0.96, but I'm still getting the error above. Here is an image of my environment variables.
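For reference, here is a minimal diagnostic sketch (not part of the project, just something that can be run inside the same venv as run.py) to confirm which onnxruntime package is installed and which execution providers it reports:

```python
# Diagnostic sketch, assuming it is run inside the same venv as run.py.
import onnxruntime as ort

print("onnxruntime version:", ort.__version__)

# Providers compiled into the installed wheel; 'CUDAExecutionProvider' should
# appear here for the onnxruntime-gpu package, even if its DLL later fails to load.
print("available providers:", ort.get_available_providers())

# 'GPU' means the package was built with GPU support; it does not guarantee that
# the CUDA/cuDNN DLLs can actually be located at session-creation time.
print("device:", ort.get_device())
```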
Error Log
(venv) C:\Users\User\Documents\GitHub\Deep-Live-Cam>python run.py --execution-provider cuda
2025-05-23 22:41:38.3929911 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\User\Documents\GitHub\Deep-Live-Cam\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
EP Error D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
when using ['CUDAExecutionProvider']
Falling back to ['CUDAExecutionProvider', 'CPUExecutionProvider'] and retrying.
2025-05-23 22:41:38.4546697 [E:onnxruntime:Default, provider_bridge_ort.cc:1480 onnxruntime::TryGetProviderInfo_CUDA] D:\a\_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1193 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\Users\User\Documents\GitHub\Deep-Live-Cam\venv\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"
Exception in Tkinter callback
Traceback (most recent call last):
File "C:\Users\User\Documents\GitHub\Deep-Live-Cam\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 419, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "C:\Users\User\Documents\GitHub\Deep-Live-Cam\venv\lib\site-packages\onnxruntime\capi\onnxruntime_inference_collection.py", line 463, in _create_inference_session
sess.initialize_session(providers, provider_options, disabled_optimizers)
RuntimeError: D:\a\_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:743 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasn't able to be loaded. Please install the correct version of CUDA and cuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
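In case it helps anyone looking at this: LoadLibrary error 126 generally means a dependent DLL could not be found. A small sketch (again, not part of the project, just a debugging aid) to list which CUDA/cuDNN-related directories are actually visible on PATH from inside the venv:

```python
# Sketch: print PATH entries that look CUDA/cuDNN related, since LoadLibrary
# error 126 usually means a dependent DLL (e.g. from CUDA or cuDNN) was not found.
import os

for entry in os.environ.get("PATH", "").split(os.pathsep):
    lowered = entry.lower()
    if "cuda" in lowered or "cudnn" in lowered:
        print(entry)
```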