libbitsandbytes using older CUDA 11.6
Hi. When I am training a Stable Diffusion 2.1 768px model on Windows in a conda env, I see this line in my CLI: CUDA SETUP: Loading binary....... libbitsandbytes_cuda116.dll
I wonder why an older CUDA version is used here, since I have installed CUDA 11.8 and torch 1.13.0 with CUDA 11.7 support (torch 1.13.0+cu117), and even bitsandbytes 0.35.0 (which I have to use for 8-bit Adam) supports CUDA 11.8.
I am using an RTX 4080 16GB.
What can I change to use a newer CUDA version for training and inference? At least 11.7?
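For context, the suffix in the loaded binary name encodes the CUDA runtime version bitsandbytes detects at import time, not necessarily the toolkit you installed. A minimal sketch of that naming convention, matching the log line above (the helper function name is hypothetical, not part of the bitsandbytes API):

```python
def bnb_binary_name(cuda_version: str, platform_ext: str = "dll") -> str:
    """Map a detected CUDA runtime version (e.g. "11.6") to the binary
    name bitsandbytes loads (e.g. "libbitsandbytes_cuda116.dll").
    Sketch only: the function is hypothetical; the naming pattern is
    taken from the log line in this issue."""
    major, minor = cuda_version.split(".")[:2]
    return f"libbitsandbytes_cuda{major}{minor}.{platform_ext}"

print(bnb_binary_name("11.7"))  # libbitsandbytes_cuda117.dll
```

So seeing cuda116 in the log suggests the library detected (or was shipped with) a CUDA 11.6 runtime, regardless of which toolkit is installed system-wide.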
Interestingly, today it's loading libbitsandbytes_cudaall.dll. I don't really know what changed... How can I influence which version is used? I did re-run accelerate config for the venv; I guess this could be the reason?
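One way to see what bitsandbytes can possibly load is to list the precompiled binaries that actually ship in the installed package directory. A small sketch, assuming you pass the real install path yourself (e.g. <venv>/Lib/site-packages/bitsandbytes on Windows; the function name is made up for illustration):

```python
from pathlib import Path

def list_bnb_binaries(package_dir: str) -> list:
    """List the precompiled libbitsandbytes_* binaries present in an
    installed bitsandbytes directory. Sketch only: point package_dir at
    the actual site-packages/bitsandbytes folder of your venv."""
    return sorted(p.name for p in Path(package_dir).glob("libbitsandbytes_*"))
```

Comparing this listing before and after an upgrade (or after an extension modifies the package) would show whether a cuda116, cuda117, or cudaall binary appeared or disappeared.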
Sorry I'm a bit lost here - what is the issue? You should be able to use CUDA 11.7 without any problems.
I figured out what was going on. Since bitsandbytes unfortunately does not officially support Windows, a Dreambooth extension (https://github.com/d8ahazard/sd_dreambooth_extension/tree/main/bitsandbytes_windows) that I installed earlier in the same conda venv made this work for me. That extension "hacks" the bitsandbytes library and also places a precompiled DLL there. Upgrading bitsandbytes removed that "hack", which led to this behavior. Since this has nothing to do with the diffusers repo, I'll close this issue.
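For anyone hitting the same thing: the workaround amounts to copying the extension's patched files (the modified .py files plus the precompiled DLLs) over the installed bitsandbytes package, and it has to be re-applied after every bitsandbytes upgrade, because pip overwrites the patched files. A rough sketch, where both paths are assumptions you must fill in yourself (patch_dir is the extension's bitsandbytes_windows folder, bnb_dir the venv's site-packages/bitsandbytes):

```python
import shutil
from pathlib import Path

def apply_windows_patch(patch_dir: str, bnb_dir: str) -> list:
    """Copy the extension's patched bitsandbytes files over the installed
    package, preserving the relative directory layout. Sketch only: both
    paths are assumptions -- point patch_dir at the extension's
    bitsandbytes_windows folder and bnb_dir at your venv's
    site-packages/bitsandbytes. Re-run after every bitsandbytes upgrade."""
    src, dst = Path(patch_dir), Path(bnb_dir)
    copied = []
    for f in src.rglob("*"):
        if f.is_file():
            target = dst / f.relative_to(src)
            # Recreate subfolders (e.g. cuda_setup/) before copying.
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)
            copied.append(str(f.relative_to(src)))
    return sorted(copied)
```

The returned list of relative paths makes it easy to log exactly which files were overwritten, which helps when a later upgrade silently reverts the patch again.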