MoneyPrinterTurbo
Ollama Set Up
Running Ollama on localhost:11434, I get the following error:
Here is the LLM Setup:
And here is the Ollama service running:
Any help would be greatly appreciated!
Are you deploying with Docker?
If it's deployed with Docker, the base URL should be http://host.docker.internal:11434/
OK, that worked. However, the Ollama logs now show that the call to /chat/completions returns a 404 error.
The full base URL should be:
http://host.docker.internal:11434/v1
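To see why the trailing /v1 matters: Ollama serves its OpenAI-compatible routes under the /v1 prefix, and an OpenAI-style client appends /chat/completions to whatever base URL it is given. A minimal sketch of the resulting endpoint (URL taken from this thread):

```python
# Ollama's OpenAI-compatible API lives under the /v1 prefix.
# Without it, the client POSTs to /chat/completions at the server
# root, a route Ollama does not serve, hence the 404.
base_url = "http://host.docker.internal:11434/v1"
endpoint = f"{base_url.rstrip('/')}/chat/completions"
print(endpoint)
```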
The same error still shows up. Shouldn't it be calling the Ollama API at /api/generate?
You should run the command `ollama list`
to check which models you have installed,
and then set a valid model name.
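For reference, a check might look like the transcript below (the model name shown is only an example; your installed models will differ). The value in the NAME column is what belongs in the model-name field of the LLM settings:

```
$ ollama list
NAME            ID              SIZE      MODIFIED
llama3:latest   365c0bd3c000    4.7 GB    2 days ago
```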