
[Question]: RAGFLOW Fails to Add Ollama Model via add_llm() While Direct curl Call Succeeds

Open · killeress opened this issue on Mar 10 '25 · 4 comments

Describe your problem

Description: I am encountering an issue where adding an Ollama model as a model provider in RAGFLOW fails, even though a direct curl call to the same Ollama server succeeds:

curl -X POST http://ihrtn.cminl.oa/ollama/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "prompt": "Please help me write a song", "max_tokens": 1000, "stream": false, "decode": true}'

[Image: response returned by the Ollama server]

This returns a valid response, indicating the Ollama server and the deepseek-r1:32b model are working correctly.

Failed RAGFLOW Configuration: In RAGFLOW, I attempt to add the model using the "Add Model Provider" feature with the following configuration:

post { "model_type": "chat", "llm_name": "deepseek-r1:32b", "api_base": "http://ihrtn.cminl.oa/ollama", "api_key": "", "max_tokens": 8192, "llm_factory": "Ollama" }

error: hint : 102 Fail to access model(deepseek-r1:32b). ERROR:

404 Not Found
nginx

[Image: error shown in the RAGFLOW UI]
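Note that the 404 page above is produced by nginx, not by Ollama itself, which suggests the path RAGFLOW appends to api_base is not covered by the proxy's /ollama location. If RAGFLOW's Ollama integration calls the chat endpoint (as the official Ollama client does for chat models) rather than /api/generate, a quick sanity check through the same proxy prefix would look roughly like this (a sketch, assuming the same host and /ollama prefix as the working curl):

# List local models through the proxy (Ollama's /api/tags endpoint)
curl http://ihrtn.cminl.oa/ollama/api/tags

# Exercise the chat endpoint the way an Ollama chat client would
curl -X POST http://ihrtn.cminl.oa/ollama/api/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "messages": [{"role": "user", "content": "hello"}], "stream": false}'

If these also return the nginx 404 page, the proxy is only forwarding /api/generate and its location block would need to cover the other /api/* paths as well.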

Here is my ragflow-server log:

[Image: ragflow-server log output]

killeress · Mar 10 '25 07:03

I have tried another RAG service (AnythingLLM) in a different container on the same Linux server, and it was able to successfully add the LLM model. This should rule out network or Docker container issues, right? Please refer to the image below.

[Image: AnythingLLM successfully configured with the same Ollama model]

killeress · Mar 10 '25 08:03

I am sure there must be a connection issue. RAGFLOW (Docker) has a built-in network. Try entering the ragflow-server Docker container with docker exec -it ragflow-server bash (or your container name). Try pinging your URL from inside the container (and curl, if installed). It might also be a DNS-resolution issue. Some containers are pre-configured for local use, and you might need to adjust the Docker network configuration to get access.
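A minimal sketch of those checks, assuming the container is named ragflow-server and using the URL from the original post:

# Open a shell inside the RAGFLOW container
docker exec -it ragflow-server bash

# From inside the container: check that the hostname resolves and is reachable
ping -c 3 ihrtn.cminl.oa

# Repeat the request that works from the host, but from inside the container
curl -X POST http://ihrtn.cminl.oa/ollama/api/generate \
  -H "Content-Type: application/json" \
  -d '{"model": "deepseek-r1:32b", "prompt": "ping", "stream": false}'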

Snify89 · Mar 10 '25 09:03

I have already tried entering the container. Making a POST request to the LLM with curl from inside the container was successful.

killeress · Mar 11 '25 00:03

I faced the same issue. How can it be fixed?

z1s8h5 · Mar 11 '25 07:03

Fixed by #5947

JinHai-CN · Mar 12 '25 06:03