
Recommended hardware for GPU inference

Open imwide opened this issue 1 year ago • 0 comments

I want to buy the necessary hardware to load and run this model on a GPU through Python, ideally at about 5 tokens per second or more. What GPU, RAM, and CPU do you recommend? (I want to make an API for personal use.) My budget is about 1000€.
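For context, here is a minimal sketch of what GPU inference through Python could look like with the gpt4all bindings, including a rough tokens-per-second measurement. It assumes a bindings version that accepts a `device="gpu"` argument, and the model filename is only an example:

```python
# Minimal sketch: load a model on the GPU with the gpt4all Python bindings
# and measure rough generation throughput in tokens per second.
# Assumes a bindings version that supports device="gpu"; the model name is an example.
import time

from gpt4all import GPT4All

model = GPT4All("mistral-7b-instruct-v0.1.Q4_0.gguf", device="gpu")

prompt = "Explain in one paragraph what GPU offloading does for LLM inference."
start = time.perf_counter()
output = model.generate(prompt, max_tokens=200)
elapsed = time.perf_counter() - start

# Rough estimate: use the word count as a proxy for generated tokens.
approx_tokens = len(output.split())
print(output)
print(f"~{approx_tokens / elapsed:.1f} tokens/sec")
```

Timing a short generation like this is a quick way to check whether a given GPU actually reaches the ~5 tokens/sec target before committing to hardware.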

imwide · Apr 15 '23 18:04