meng-hui
@MandarUkrulkar check that you changed both `PGPT_OLLAMA_API_BASE` and `PGPT_OLLAMA_EMBEDDING_API_BASE` to use `http://host.docker.internal:11434`. You might also need to run `ollama pull nomic-embed-text` and `ollama pull llama3.2` beforehand, because pulling the model...
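For reference, the two variables above would typically be set in a docker-compose override like the sketch below. The service name and file layout are assumptions and may not match your setup; only the two `PGPT_OLLAMA_*` variables and the host URL come from this thread.

```yaml
# Hypothetical docker-compose excerpt -- adjust the service name to match your file.
services:
  private-gpt:
    environment:
      - PGPT_OLLAMA_API_BASE=http://host.docker.internal:11434
      - PGPT_OLLAMA_EMBEDDING_API_BASE=http://host.docker.internal:11434
    extra_hosts:
      # On Linux, host.docker.internal is not defined by default;
      # this maps it to the host gateway.
      - "host.docker.internal:host-gateway"
```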
@jaluma thanks for the reply. Indeed I did not have `OLLAMA_HOST=0.0.0.0` set; that resolves the 403. In this thread there is also a 503, which seems to be because traefik is...
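For anyone hitting the same 403: by default Ollama only listens on `127.0.0.1`, so requests from inside a Docker container are rejected. A minimal sketch of exposing it on all interfaces (assuming you run `ollama serve` manually; systemd installs set this differently):

```shell
# Make the Ollama server reachable from Docker containers.
# Note: 0.0.0.0 exposes the API on all interfaces -- firewall accordingly.
export OLLAMA_HOST=0.0.0.0
ollama serve
```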
I have noticed that stopping the summarize prompt in the UI also does not stop the model from generating output in the trace. I'm using the Ollama API.