FreedomGPT
Can it use the GPU in the Windows app?
I saw that it's using the CPU only, which is slower. Can you implement a way to let the user choose between CPU and GPU?
Bump^
It would be great to be able to use the GPU instead of the CPU, for sure.
Unfortunately, chat.exe / chat is compiled without CUDA support. I have no idea why.
You could load the Alpaca model in KoboldAI and use it with the GPU there.
You would need the entire file structure from https://huggingface.co/Sosaka/Alpaca-native-4bit-ggml/tree/main
placed in the KoboldAI\models directory, under a subfolder named Alpaca-native-4bit-ggml.
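In case it helps, here's a minimal Python sketch for pulling that repo down with huggingface_hub's snapshot_download. The local_dir path is only an assumption about where your KoboldAI install lives, so adjust it to your setup:

```python
# Sketch: download the full Alpaca-native-4bit-ggml repo into KoboldAI's models folder.
# Assumes `pip install huggingface_hub` and that KoboldAI sits in the current directory;
# change local_dir if your install lives somewhere else.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="Sosaka/Alpaca-native-4bit-ggml",
    local_dir=r"KoboldAI\models\Alpaca-native-4bit-ggml",
)
```

After that, the model should show up in KoboldAI's model picker under that subfolder name.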