private-gpt
Can I use my Radeon 5500 for private gpt?
I am using the local Ollama-powered setup, and it doesn't use my GPU at all, so it is really slow. I am wondering if I can use my GPU in some way to get faster results and take some of the load off my CPU. Thanks in advance!
Maybe this can help you out: https://github.com/ROCm/ROCm/discussions/2629. I'm not sure which 5500 you have.
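For what it's worth, RDNA1 cards like the RX 5500 (gfx1012) aren't on ROCm's officially supported list, so Ollama's ROCm backend may just skip the card. Some people report partial success by overriding the GFX version that ROCm sees before starting the Ollama server. This is only a sketch of that workaround, not something I've verified on a 5500: the exact override value (here `10.1.2`) is an assumption and may need to be changed or may not work at all for your card and ROCm release.

```python
import os
import subprocess

# Sketch of the HSA_OVERRIDE_GFX_VERSION workaround for unsupported AMD cards.
# The override value below is a guess for a gfx1012 (RX 5500-class) GPU;
# whether Ollama's ROCm build accepts it depends on your driver/ROCm version.
env = os.environ.copy()
env["HSA_OVERRIDE_GFX_VERSION"] = "10.1.2"  # assumption; adjust for your card

# Start the Ollama server with the override in place, then check its log
# output to see whether a ROCm/GPU device is actually detected and used.
subprocess.run(["ollama", "serve"], env=env)
```

You can equally just export the variable in your shell before running `ollama serve`; the point is only that the override has to be set in the environment of the Ollama server process, not of private-gpt itself.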
Is this still an issue?