[Question]: RAGFLOW Fails to Add Ollama Model via add_llm() While Direct curl Call Succeeds
Describe your problem
Description: I am encountering an issue where adding an Ollama model as a service provider in RAGFLOW fails, even though I can successfully call the same Ollama server with a direct curl command:
```shell
curl -X POST http://ihrtn.cminl.oa/ollama/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "prompt": "Please help me write a song", "max_tokens": 1000, "stream": false, "decode": true}'
```
This returns a valid response, indicating that the Ollama server and the deepseek-r1:32b model are working correctly.
Failed RAGFLOW configuration: In RAGFLOW, I attempt to add the model using the "Add Model Provider" feature, which sends the following POST body:
post { "model_type": "chat", "llm_name": "deepseek-r1:32b", "api_base": "http://ihrtn.cminl.oa/ollama", "api_key": "", "max_tokens": 8192, "llm_factory": "Ollama" }
Error:

```
hint : 102 Fail to access model(deepseek-r1:32b).ERROR: 404 Not Found
```
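A 404 (rather than a connection error) often means the request reached a server but hit the wrong path. One illustrative cause, sketched below with Python's standard `urljoin` (this is not RAGFLOW's actual code, just a demonstration of the pitfall): if a client joins `api_base` with the endpoint path and the base lacks a trailing slash, the reverse-proxy prefix (`/ollama` here) can silently disappear from the final URL.

```python
# Sketch of a URL-joining pitfall that can turn a proxied Ollama base
# URL into a 404. The host and "/ollama" prefix are the ones from this
# thread; the joining behavior shown is standard urllib semantics.
from urllib.parse import urljoin

api_base = "http://ihrtn.cminl.oa/ollama"

# Without a trailing slash, urljoin treats "ollama" as a file-like last
# segment and replaces it, dropping the proxy prefix entirely:
print(urljoin(api_base, "api/chat"))        # http://ihrtn.cminl.oa/api/chat

# With a trailing slash, the prefix is kept as a directory:
print(urljoin(api_base + "/", "api/chat"))  # http://ihrtn.cminl.oa/ollama/api/chat
```

If the prefix is being dropped this way, the request lands on a path the proxy does not serve, which matches the 404 seen above even though a direct curl to the full URL works.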
Here is my ragflow-server log:
I have tried another RAG service (AnythingLLM) in a different container on the same Linux server, and it was able to add the LLM model successfully. This should rule out network or Docker container issues, right? Please refer to the image below.
I am sure there must be a connection issue. RAGFLOW (Docker) has a built-in network. Try entering the ragflow-server container with `docker exec -it ragflow-server bash` (or your container name), then ping your URL from inside the container (and curl it, if curl is installed). It might also be a DNS resolution issue: some containers are pre-configured for local use only, and you may need to adjust Docker's network configuration to get access.
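The DNS and reachability checks described above can also be run as a small script inside the container (useful when ping or curl is not installed there). This is a hypothetical diagnostic, not part of RAGFLOW; the host name and port are the ones from this thread and should be adjusted to your setup.

```python
# Hypothetical connectivity check to run inside the ragflow-server
# container. Verifies (1) that the Ollama host name resolves via the
# container's DNS and (2) that a TCP connection to it can be opened.
import socket

def resolve(host):
    """Return the IPv4 address a hostname resolves to, or None on DNS failure."""
    try:
        return socket.gethostbyname(host)
    except socket.gaierror:
        return None

def tcp_reachable(host, port, timeout=3):
    """True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    host = "ihrtn.cminl.oa"  # the Ollama host from this thread
    ip = resolve(host)
    print(f"DNS: {host} -> {ip}")
    if ip:
        print(f"TCP port 80 reachable: {tcp_reachable(host, 80)}")
```

If DNS resolution fails here but works on the Docker host, the container's resolver configuration (not RAGFLOW itself) is the likely culprit.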
I have already tried entering the container. A curl POST request to the LLM from inside the container succeeded.
I am facing the same issue. How can it be fixed?
Fixed by #5947