text-generation-webui
Not using CUDA
I am running Windows 10 and ran the install script with no errors, but when I run the start-webui script it won't find my GPU (RTX 3060 Ti). Running conda list reveals no installed CUDA package even though I selected Nvidia during installation. Running nvcc also doesn't find anything, and I've had a similar issue before, so I believe it's something to do with the PATH, though I don't understand that part. Checking the torch version also reveals 2.0.0+cpu.
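For anyone wanting to reproduce that check, something like this (run inside the webui's conda environment; the fallback handling is just for illustration) reports the installed torch build and whether it can see the GPU:

```python
# Report the installed torch build and CUDA visibility.
# A version string ending in "+cpu" means the CPU-only wheel was installed.
try:
    import torch
    torch_info = (torch.__version__, torch.cuda.is_available())
    print("torch:", torch_info[0])
    print("CUDA available:", torch_info[1])
    if torch_info[1]:
        print("device:", torch.cuda.get_device_name(0))
except ImportError:
    torch_info = None
    print("torch is not installed in this environment")
```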
Try re-downloading the script and starting over; it was updated yesterday:
https://github.com/oobabooga/one-click-installers/archive/refs/heads/oobabooga-windows.zip
I downloaded it about 3 hours ago, so it should be up to date.
I'm not familiar with conda. How would I tell it to install CUDA and also the GPU build of torch?
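Not an official answer, but as a rough sketch the commands would look something like this, run with the webui's conda environment active. CUDA 11.7 and the cu117 wheel are assumptions here; match them to whatever CUDA version you actually install. The block dry-runs by default so nothing is changed until you opt in:

```shell
# Sketch: add the CUDA runtime to the active conda env, then swap the
# CPU-only torch wheel for a CUDA build. Versions (11.7 / cu117 / 2.0.0)
# are assumptions, not taken from the thread.
# Dry-run by default; set APPLY=1 to actually execute the commands.
run() { if [ "${APPLY:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi; }

run conda install -y -c "nvidia/label/cuda-11.7.0" cuda
run pip install "torch==2.0.0+cu117" --index-url https://download.pytorch.org/whl/cu117
```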
Also running into this issue. I'm using an RTX 3060 with Windows 11 and just downloaded the script an hour ago.
I have the same issue: 3060 Ti, Windows 10, three fresh installs.
Pretty sure it has something to do with CUDA, specifically the version and how conda uses it. I'll probably have another go at it later today.
Can you try WSL? There are some 40 issues about CUDA on Windows. WSL should be a smoother experience.
See here the new instructions: https://github.com/oobabooga/text-generation-webui#installation
I'll try it later. Currently experimenting with conda and installing CUDA: wondering if I can install CUDA 11.7 before running the main install script, then doing the same with PyTorch, making sure it's compiled for GPU. If that all fails I'll let you know and try WSL.
Interesting news: from a clean install I installed Miniconda first, then CUDA 11.7 through conda, and then installed the CUDA build of PyTorch. Before, running torch.cuda.is_available() would return False, and now it returns True. Next step is to download Pygmalion and test it out completely (wish me luck).
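For anyone who wants to retrace that sequence, it corresponds roughly to the commands below (assuming Miniconda is already installed and on PATH; the env name `textgen`, CUDA 11.7, and the cu117 wheel are all assumptions). Dry-run by default so nothing runs until you opt in:

```shell
# Sketch of the clean-install sequence described above.
# Assumes Miniconda is installed; "textgen", 11.7, and cu117 are assumptions.
# Dry-run by default; set APPLY=1 to actually execute the commands.
run() { if [ "${APPLY:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi; }

run conda create -y -n textgen python=3.10
run conda install -y -n textgen -c "nvidia/label/cuda-11.7.0" cuda
run conda run -n textgen pip install "torch==2.0.0+cu117" --index-url https://download.pytorch.org/whl/cu117
# Verify: should print True once the CUDA build is in place.
run conda run -n textgen python -c "import torch; print(torch.cuda.is_available())"
```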
I ended up getting this to work after using WSL... kinda. Now I'm having an issue similar to this: https://github.com/oobabooga/text-generation-webui/issues/41
I set the RAM limit to 16GB in a config file, but it's still getting killed at around 80% GPU usage. This is the model I'm attempting to use: https://huggingface.co/chavinlo/alpaca-native
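For reference, assuming the config file in question is `%UserProfile%\.wslconfig` on the Windows side (the comment doesn't name it), a 16 GB cap looks like:

```ini
; %UserProfile%\.wslconfig (assumed; not named above)
[wsl2]
memory=16GB
```

Note that WSL needs a restart (`wsl --shutdown` from Windows) before changes to this file take effect.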
I had this problem too (Windows without WSL), and I think it was because the script installs a different version of CUDA than PyTorch 2.0.0 supports. My solution was to use the PyTorch version right before 2.0.0, which got me past the problem. I ran into some other problems along the way, and it ended up taking hours to work through and debug. I wrote some instructions to help.
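The instructions themselves aren't reproduced in this thread, but the core fix, pinning torch to the last release before 2.0.0, would look roughly like this (1.13.1 and cu117 are assumptions, since exact versions aren't named above). Dry-run by default so nothing is uninstalled until you opt in:

```shell
# Sketch: replace the torch 2.0.0 CPU wheel with the previous release's CUDA build.
# torch 1.13.1+cu117 is an assumption; the comment above doesn't name exact versions.
# Dry-run by default; set APPLY=1 to actually execute the commands.
run() { if [ "${APPLY:-0}" = "1" ]; then "$@"; else echo "+ $*"; fi; }

run pip uninstall -y torch
run pip install "torch==1.13.1+cu117" --index-url https://download.pytorch.org/whl/cu117
```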
This issue has been closed due to inactivity for 30 days. If you believe it is still relevant, please leave a comment below.