web-ui
Error: model requires more system memory, but it loads fine locally with `ollama run`
Hi there,
I am trying to run Ollama models locally through the web UI, but it complains that there is not enough memory. I can run the same models with the `ollama run` command without issues. Can someone tell me whether I am doing something wrong, or how I can fix this?
Thank you
API endpoint for Ollama: http://localhost:11434
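For reference, one way to narrow this down is to call that endpoint directly and see whether the out-of-memory error comes from the Ollama server itself or from the web UI. Below is a minimal diagnostic sketch in Python (assuming the `requests` package is installed; the model name llama2:7b is just an example taken from this thread, swap in whichever model fails for you):

```python
# Minimal sketch: query the same Ollama HTTP API the web UI uses.
import requests

OLLAMA_URL = "http://localhost:11434"

# 1. Confirm the server is reachable and list the models it knows about.
tags = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
tags.raise_for_status()
print("Models available via the API:",
      [m["name"] for m in tags.json().get("models", [])])

# 2. Ask the API (not the CLI) to load the model and generate a short reply.
#    If this also fails with a memory error, the problem is on the server side,
#    not in the web UI itself.
resp = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": "llama2:7b", "prompt": "Say hi", "stream": False},
    timeout=300,
)
if resp.ok:
    print(resp.json().get("response"))
else:
    # Ollama returns memory-related failures in the error body of the response.
    print("Error from Ollama:", resp.status_code, resp.text)
```

If the direct API call succeeds but the web UI still reports insufficient memory, the UI (or how it is configured to reach the endpoint) is the more likely culprit.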
+1. Were you able to resolve this? I cannot use llama2:7b.
No, I didn't get a solution, unfortunately.
I have the same issue on my Windows PC. Can anybody please help with this?
Same issue here too!