Blake Wyatt
Thanks @NenadZG. I've updated my instructions with your GPTQ rollback fix.
Good to know that's possible. I'll update my instructions once all versions of the model have been requantized.
> Ok I got it
>
> 1. Start over
>
> ```
> conda deactivate
> conda remove -n textgen --all
> conda create -n textgen python=3.10.9
> ...
> ```
@gsgoldma I ran into this error as well. Your installed CUDA version is 12.0, which isn't compatible with your PyTorch build for CUDA 11.7. You need to downgrade your CUDA version to one...
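In case it helps anyone hitting the same mismatch, here's a rough sketch (an assumption about the setup, not a definitive fix) of how to check which CUDA version your PyTorch build expects and, if needed, reinstall a matching cu117 build; adjust the torch version to whatever the webui requires:

```
# Check the system CUDA toolkit and driver versions
nvcc --version
nvidia-smi

# Check which CUDA version the installed PyTorch was built against
python -c "import torch; print(torch.__version__, torch.version.cuda)"

# If they disagree, one option is to reinstall PyTorch from the cu117 wheel index
pip install --force-reinstall torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu117
```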
@Fenfel I'd delete your environment, files, and start over. I've been able to get everything working correctly on Windows. I put some instructions [here](https://github.com/xNul/chat-llama-discord-bot#llama-setup-normal8bit4bit-for-text-generation-webui) which may help.
> > @Fenfel I'd delete your environment, files, and start over. I've been able to get everything working correctly on Windows. I put some instructions [here](https://github.com/xNul/chat-llama-discord-bot#llama-setup-normal8bit4bit-for-text-generation-webui) which may help.
>
> ...
I got this error a few hours ago and got rid of it somehow. [These](https://github.com/xNul/chat-llama-discord-bot#llama-setup-normal8bit4bit-for-text-generation-webui) steps ended up working for me on Windows without WSL. I'd delete your environment, files,...
I had this problem too (Windows without WSL), and I think it was because the script installs a different version of CUDA than PyTorch 2.0.0 supports. My solution was to...
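If it's useful to anyone else, here is a hedged sketch of one way to keep the CUDA runtime and PyTorch 2.0.0 aligned inside the conda env (it mirrors the official PyTorch conda install command; the `textgen` env name and the CUDA 11.7 choice are assumptions, not necessarily the fix described above):

```
# Inside the (hypothetical) textgen environment
conda activate textgen

# Install PyTorch 2.0.0 together with a matching CUDA 11.7 runtime from the
# pytorch/nvidia channels, so the setup script can't pull in a mismatched toolkit
conda install pytorch==2.0.0 torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia
```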
Thank you. I think this is due to the update that 'fixed' the guild bug; because of how it's being applied, it's very error-prone. I'm working on a better...
I think https://github.com/xNul/palworld-host-save-fix/issues/69 is where you want to look. I'll close this issue as a duplicate. Please use that issue for any further discussion.