gpt4all
CPU vs GPU and VRAM
I'm using gpt4all on a Ryzen 5 6600 with 32 GB of RAM, but I still find it quite slow. Would I get faster results with a GPU version? I only have a 3070 with 8 GB of VRAM, so is it even possible to run gpt4all on that GPU?
@pezou45
Note: the full model on GPU (16GB of RAM required) performs much better in our qualitative evaluations.
as stated in the README.
Thanks, indeed. But is there any workaround to make it work with an 8 GB GPU? I imagine results would still be better than on CPU, but I haven't seen anywhere that it can run on less than 12 GB.
Try the model on CPU; it should run reasonably fast. Also try changing the thread count from 8 to 4.
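A minimal sketch of the thread-count advice above, assuming the common rule of thumb for llama.cpp-based runtimes (one thread per physical core rather than per logical SMT thread). The `pick_thread_count` helper and the `n_threads` constructor argument shown in the commented usage are illustrative assumptions; check the parameters accepted by your installed gpt4all version.

```python
import os

def pick_thread_count():
    """Heuristic thread count for llama.cpp-based inference.

    Many such runtimes run fastest with one thread per physical core.
    os.cpu_count() reports logical threads, so assume 2-way SMT
    (typical on modern Ryzen/Intel desktop parts) and halve it.
    """
    logical = os.cpu_count() or 1
    return max(1, logical // 2)

# Hypothetical usage with the gpt4all Python bindings (the n_threads
# keyword is an assumption; verify against your installed version):
# from gpt4all import GPT4All
# model = GPT4All("model-name-here", n_threads=pick_thread_count())

print(pick_thread_count())
```

On a 6-core/12-thread CPU like the one above, this heuristic would suggest 6 threads; the commenter's suggestion of 4 leaves headroom for the rest of the system, so trying a few values is worthwhile.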
Stale, please open a new, updated issue if this is still relevant.