
Onnxruntime LoadLibrary failed with error 126

Sumphy-ai opened this issue 1 year ago • 6 comments

Describe the issue

When I'm using the Stable Diffusion Automatic1111 WebUI with the ControlNet IP-Adapter and ip-adapter-faceid-plus v2 (created by h94 on Hugging Face), I keep getting the following error message:

[E:onnxruntime:Default, provider_bridge_ort.cc:1745 onnxruntime::TryGetProviderInfo_CUDA] D:\a_work\1\s\onnxruntime\core\session\provider_bridge_ort.cc:1426 onnxruntime::ProviderLibrary::Get [ONNXRuntimeError] : 1 : FAIL : LoadLibrary failed with error 126 "" when trying to load "C:\sd.webui\system\python\lib\site-packages\onnxruntime\capi\onnxruntime_providers_cuda.dll"

PS: I'm not a dev and not a native English speaker. I followed the tutorials and solutions proposed here, but I may have made mistakes.

To reproduce

Launch the Stable Diffusion Automatic1111 WebUI and generate an 832x1216 px image using the JuggernautX model and the DPM++ 2M Karras sampler with 4.5 CFG and 30 steps. Activate the ControlNet IP-Adapter and load any preprocessor together with ip-adapter-faceid-plusv2_sdxl. Start the generation.

Urgency

The issue is not that urgent since no client is involved, but I'm stuck in my project's development.

Platform

Windows

OS Version

10

ONNX Runtime Installation

Other / Unknown

ONNX Runtime Version or Commit ID

CUDA 12

ONNX Runtime API

Python

Architecture

Other / Unknown

Execution Provider

CUDA

Execution Provider Library Version

No response

Sumphy-ai avatar Jul 25 '24 08:07 Sumphy-ai

Follow the installation instructions for CUDA and cuDNN very very precisely. Make sure the libraries from both are available in the PATH that is set when the load failure occurs.

You MUST use the instructions for the specific version of CUDA and cuDNN. e.g. some versions of cuDNN require zlibwapi.dll to be installed.
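A quick way to verify the PATH part of this on the affected machine (a minimal Python sketch; the "CUDA" and "cudnn" directory keywords are assumptions based on default Windows install paths, so adjust them to your actual folder names):

```python
import os

# LoadLibrary error 126 means a dependent DLL could not be found,
# which on Windows is usually a PATH problem. This helper checks
# whether any PATH entry mentions the given keywords.
def dirs_on_path(keywords):
    entries = os.environ.get("PATH", "").split(os.pathsep)
    return {kw: any(kw.lower() in entry.lower() for entry in entries)
            for kw in keywords}

# Illustrative keywords; match them to your real install directories,
# e.g. "CUDA\\v12.2\\bin" or the cuDNN bin folder.
print(dirs_on_path(["CUDA", "cudnn"]))
```

If either entry prints `False` here at the moment the load failure occurs, the loader cannot find the corresponding DLLs regardless of what is installed on disk.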

skottmckay avatar Jul 25 '24 10:07 skottmckay

> Follow the installation instructions for CUDA and cuDNN very very precisely. Make sure the libraries from both are available in the PATH that is set when the load failure occurs.
>
> You MUST use the instructions for the specific version of CUDA and cuDNN. e.g. some versions of cuDNN require zlibwapi.dll to be installed.

Thank you for your quick answer; I'm going to reinstall them very carefully.

About the PATH: is it related to the parameters in the .bat file that launches the app, or should I just check whether the variables point to the folder indicated in the error message?

Sumphy-ai avatar Jul 25 '24 10:07 Sumphy-ai

Here is some feedback: I followed the instructions to install the compatible CUDA and cuDNN versions, and even TensorRT, since some people said it was required and I didn't have it.

But the problem remains, and I now get this message:

D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:891 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.

So I guess I installed the wrong versions (which I thought were the right ones according to the requirements doc...).

How can I be sure which versions go with my program?
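For reference, a minimal sketch of how to print which onnxruntime build and providers are actually installed (assuming the standard Python API; note that `ort.get_available_providers()` lists the providers the build was compiled with, not ones that are guaranteed to load):

```python
try:
    import onnxruntime as ort
except ImportError:
    ort = None

def report():
    # Summarize the installed onnxruntime build and its compiled-in providers.
    if ort is None:
        return "onnxruntime not installed"
    return f"onnxruntime {ort.__version__}, providers: {ort.get_available_providers()}"

print(report())
```

The version printed here is what you need to look up in the requirements table on the CUDA Execution Provider page to find the matching CUDA and cuDNN versions.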

Sumphy-ai avatar Jul 25 '24 18:07 Sumphy-ai

> Here is some feedback: I followed the instructions to install the compatible CUDA and cuDNN versions, and even TensorRT, since some people said it was required and I didn't have it.
>
> But the problem remains, and I now get this message:
>
> D:\a_work\1\s\onnxruntime\python\onnxruntime_pybind_state.cc:891 onnxruntime::python::CreateExecutionProviderInstance CUDA_PATH is set but CUDA wasnt able to be loaded. Please install the correct version of CUDA andcuDNN as mentioned in the GPU requirements page (https://onnxruntime.ai/docs/execution-providers/CUDA-ExecutionProvider.html#requirements), make sure they're in the PATH, and that your GPU is supported.
>
> So I guess I installed the wrong versions (which I thought were the right ones according to the requirements doc...).
>
> How can I be sure which versions go with my program?

I'm also facing the same issue https://github.com/microsoft/onnxruntime/issues/21527

Noor-Nizar avatar Jul 27 '24 03:07 Noor-Nizar

I encountered the same issue, but strangely, the same environment works fine on a Linux system. Is there any way to force it to use the GPU for inference? That way I could determine which DLL is missing.

> Follow the installation instructions for CUDA and cuDNN very very precisely. Make sure the libraries from both are available in the PATH that is set when the load failure occurs.
>
> You MUST use the instructions for the specific version of CUDA and cuDNN. e.g. some versions of cuDNN require zlibwapi.dll to be installed.
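To pin down which dependent DLL is actually missing, one option is to try loading the CUDA libraries directly with `ctypes`, so the OS names the failing library instead of the blanket error 126 (a hypothetical sketch; the DLL names below assume a CUDA 12.x / cuDNN 8.x install and will differ for other versions, and the probe only does anything on Windows):

```python
import ctypes
import sys

# Illustrative DLL names for CUDA 12.x / cuDNN 8.x; adjust to your install.
CANDIDATES = ["cudart64_12.dll", "cublas64_12.dll", "cudnn64_8.dll"]

def probe_dlls(names):
    # Load each DLL directly; a per-DLL failure message identifies
    # which specific library the Windows loader cannot find.
    results = {}
    for name in names:
        if sys.platform != "win32":
            results[name] = "skipped (not Windows)"
            continue
        try:
            ctypes.WinDLL(name)
            results[name] = "loaded"
        except OSError as exc:
            results[name] = f"failed: {exc}"
    return results

for dll, status in probe_dlls(CANDIDATES).items():
    print(f"{dll}: {status}")
```

Any DLL reported as `failed` here is a likely culprit behind onnxruntime_providers_cuda.dll refusing to load.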

GilbertPan97 avatar Jul 28 '24 07:07 GilbertPan97

This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

github-actions[bot] avatar Aug 27 '24 15:08 github-actions[bot]

Sorry, I'm having the same problem. I installed the following software versions:

Microsoft.ML.OnnxRuntime.Gpu: v1.20.0, CUDA: 11.8, cuDNN: 8.9.7.29_cuda11-archive

[Two screenshots attached]

Is there any update solution here?

Sola-AIGithub avatar Nov 21 '24 03:11 Sola-AIGithub

Same issue. Has anyone got a solution?

Jeremyop avatar May 31 '25 08:05 Jeremyop

This issue has been automatically closed as 'not planned' because it has been marked as 'stale' for more than 30 days without activity. If you believe this is still an issue, please feel free to reopen it.

snnn avatar Jun 07 '25 22:06 snnn