[Question]: RAGFlow failed to connect to the local Ollama model
Self Checks
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (Language Policy).
- [x] Non-English title submissions will be closed directly (Language Policy).
- [x] Please do not modify this template :) and fill in all the required fields.
Describe your problem
Ollama pulls models to the local system successfully, and the connection test between RAGFlow and the Ollama service succeeds. But I still fail to add the local Ollama model in RAGFlow v0.18, with the error messages below. (When I deployed Ollama via Docker, I could add the local Ollama model successfully in RAGFlow v0.17.) Please let me know how to resolve this problem, thanks!
I found a similar open issue where users experienced connection problems between RAGFlow and the local Ollama model. Here are some suggested solutions that might help resolve your issue:
- **Ensure Ollama is Accessible**: Make sure that Ollama is running and accessible by setting the `OLLAMA_HOST` environment variable to the private IP address of the server where Ollama is installed. This exposes it beyond the loopback address (127.0.0.1), which RAGFlow cannot reach [1].
- **Docker Configuration**: If you are using Docker, add the following to your `docker-compose` file to ensure proper networking (see the verification sketch after this list):

  ```yaml
  ollama:
    image: ollama/ollama
    ports:
      - 11434:11434
    environment:
      - OLLAMA_HOST=0.0.0.0
    networks:
      - ragflow
    restart: always
  ```

  Use the command `docker inspect ollama` to find the IP address assigned to the Ollama container and use this IP address in the RAGFlow UI as the base URL, e.g., `http://172.xx.xx.xx:11434` [1].
- **Windows Specific Solution**: If running on Windows with Docker, use `http://host.docker.internal:11434` as the connection URL in the Ollama setup. If this does not work on Linux, modify the `docker-compose.yml` by adding `extra_hosts`:

  ```yaml
  extra_hosts:
    - "host.docker.internal:host-gateway"
  ```

  Ensure that the firewall settings allow traffic on port 11434 [1].
- **RAGFlow Configuration**: In RAGFlow v0.18, ensure that the model name and type match those you pulled when deploying Ollama using Docker. Set the Ollama base URL to the correct address, such as `http://host.docker.internal:11434/v1`, `http://localhost:11434/v1`, or `http://${IP_OF_OLLAMA_MACHINE}:11434/v1` [2].
- **API Token Workaround**: If the Ollama chat model is not appearing in the system model settings, try entering any random string in the API Token field when adding the Ollama model, even though it is not actually required for local deployment. This workaround addresses a bug in newer versions of RAGFlow when using locally deployed Ollama models [2].
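A minimal shell sketch of the first two steps above, assuming the container is named `ollama` and Docker's default bridge network; adjust names and addresses to your environment:

```bash
# Expose Ollama beyond loopback on the machine that runs it
# (0.0.0.0 binds all interfaces; a specific private IP also works):
export OLLAMA_HOST=0.0.0.0:11434
ollama serve &

# If Ollama runs in Docker, look up the IP assigned to its container:
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' ollama

# Confirm the endpoint answers before entering it in the RAGFlow UI;
# Ollama's root path replies with "Ollama is running".
# (172.17.0.2 stands in for whatever docker inspect printed.)
curl http://172.17.0.2:11434/
```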
These steps should help you resolve the connection issue between RAGFlow and the local Ollama model.
When configuring the Ollama model in your model provider, the model name must exactly match what's shown by `ollama list`. Therefore, the correct model name should be: `bge-m3:latest`
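For example, a quick check on the Ollama host (the NAME column, including the tag, is what RAGFlow expects; the output below is illustrative):

```bash
ollama list
# NAME           ID              SIZE      MODIFIED
# bge-m3:latest  790764642607    1.2 GB    2 days ago    <- illustrative output
```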
Switch `localhost` to `127.17.0.1`. Give it a try.
Thanks, Kevin @KevinHuSh @dosu
But accessing the Ollama service from the RAGFlow server still fails as shown below. That would not be a real success, right?
To resolve the "Connection refused" error when accessing the Ollama service from the RAGFlow server, ensure that the port configurations in `docker/service_conf.yaml.template` match those in `docker/.env`. Use the Docker host IP address `172.17.0.1` instead of `localhost` for the connection, so the complete endpoint URL is `http://172.17.0.1:11434`.
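A quick way to eyeball that the two files agree, assuming you are in a RAGFlow checkout with the file paths named above (the values printed by the two commands should line up):

```bash
# Compare the port settings declared in both files:
grep -i "port" docker/.env
grep -i "port" docker/service_conf.yaml.template
```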
If RAGFlow runs in Docker, localhost on the host machine is reachable from within the RAGFlow Docker container as `host.docker.internal`. If Ollama runs on the same host machine, the correct URL to use for Ollama would be `http://host.docker.internal:11434/`. You can verify Ollama's accessibility from inside the RAGFlow container by executing:

```bash
$ sudo docker exec -it ragflow-server bash
$ curl http://host.docker.internal:11434/
```

If RAGFlow and Ollama run on different machines, ensure Ollama is accessible from RAGFlow's host machine with:

```bash
$ curl http://${IP_OF_OLLAMA_MACHINE}:11434/
```
Ensure that Ollama and RAGFlow are in the same LAN and can communicate with each other. If you are using demo.ragflow.io, make sure the server hosting Ollama has a publicly accessible IP address, as 127.0.0.1 is not publicly accessible [1].
Passed with 192.168.1.26 replacing 127.17.0.1.
127.17.0.1 is a fake IP here (it falls in the 127.0.0.0/8 loopback range, so from inside a container it only reaches the container itself), while 192.168.1.26 is the real IP assigned to my Wi-Fi adapter, which can access the internet.
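A one-liner to confirm the working address from inside the RAGFlow container (container name `ragflow-server` as used above; substitute your own LAN IP for 192.168.1.26):

```bash
# Should print "Ollama is running" if the LAN IP is reachable:
$ sudo docker exec -it ragflow-server curl http://192.168.1.26:11434/
```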
same issue & same solution
Use `ifconfig | grep "192.168"` to see your local IP, btw.
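On distros that ship iproute2 instead of net-tools, an equivalent:

```bash
# ifconfig comes from the legacy net-tools package; this is the modern analogue:
ip -4 addr show | grep "192.168"
```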