[Question]: Trouble connecting to Ollama
Describe your problem
hint : 102 Fail to access model(ollama).ERROR: [Errno 111] Connection refused
I double-checked that http://localhost:11434/ shows "Ollama is running".
I'm not sure what to do now. Any tips?
As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?
Run sudo EDITOR=nano systemctl edit ollama.service and add this under the [Service] section: Environment="OLLAMA_HOST={your_local_ip}"
If the change is not picked up, edit the drop-in file directly: sudo nano /etc/systemd/system/ollama.service.d/override.conf
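For reference, a minimal sketch of what that drop-in ends up containing, assuming a systemd-managed Ollama install (binding to 0.0.0.0 exposes all interfaces; substitute your local IP if you prefer):

```bash
# Create the drop-in directory and write the override (same path as above).
sudo mkdir -p /etc/systemd/system/ollama.service.d
sudo tee /etc/systemd/system/ollama.service.d/override.conf >/dev/null <<'EOF'
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
EOF

# Reload systemd and restart Ollama so the new listen address takes effect.
sudo systemctl daemon-reload
sudo systemctl restart ollama
```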
I deployed RAGFlow with Docker on my server, and Ollama is also served on the same server. I set the base URL to http://0.0.0.0:11434 and to http://public-IP:11434; neither of them worked. Any suggestions? HELP!
Yes, they (RAGFlow and Ollama) are on the same machine, running Debian 12. However, Ollama is not dockerized; maybe that is relevant?
On your Debian 12 server, run ip addr to get your IPv4 address; suppose it is 10.x.x.8, then set the base URL as: http://10.x.x.8:11434
Since the network of the RAGFlow Docker container and the network of the host server are different, 127.0.0.1 or 0.0.0.0 won't work.
Please star RAGFlow.
I probably didn't write my thought out in enough detail. The point is that Ollama needs the IP it should listen on to be specified, and this is done in this file:
/etc/systemd/system/ollama.service.d/override.conf
Until I wrote it into that file, nothing worked for me.
I do not have this file/path. I did find /etc/systemd/system/ollama.service but not /etc/systemd/system/ollama.service.d/override.conf
This is the content of that ollama.service file
Should I try to replace the line Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games"
with Environment="OLLAMA_HOST=192.168.x.x"?
(By the way, I am super grateful for your help and knowledge. I am brand new to self-hosting, and I really can't believe how helpful you and the other FOSS/self-hosting community members are. It is really incredible.)
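A side note, in case it helps: systemd units can carry several Environment= lines, so the idea would be to add an OLLAMA_HOST line rather than replace the PATH one, ideally in a drop-in so the packaged unit stays untouched. A quick way to see what the service actually ends up with (assuming the unit is named ollama.service):

```bash
# Print the unit file together with any drop-ins that override it.
systemctl cat ollama.service

# Show the merged Environment= values the service will run with.
systemctl show ollama.service --property=Environment
```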
Okay I have done the following:
- Confirmed my local IP as 192.168.1.x
- Navigated to /user-setting/model in the RAGFlow UI
- Entered the "Model Type" as Chat
- Entered the Model Name as "Ollama" or "Llama 3" (tried both)
- Entered the base URL as http://192.168.x.x:11434 (the local IP of the Debian machine running Ollama)
The error is persistent: Fail to access model(Ollama). **ERROR**: [Errno 111] Connection refused
One thing I did notice is that when I visit http://localhost:11434/ in Firefox I see the message "Ollama is running"; however, when I navigate to http://192.168.x.x:11434/ I get "Firefox can't establish a connection to the server".
Does this mean I need to configure Ollama to listen on my local network somehow?
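That symptom usually means Ollama is bound to 127.0.0.1 only. A quick diagnostic sketch (ss ships with iproute2 on Debian; replace the 192.168.x.x placeholder with your real address):

```bash
# Which address is Ollama listening on? 127.0.0.1:11434 means localhost-only.
sudo ss -ltnp | grep 11434

# Reachable on loopback?
curl -s http://localhost:11434/
# Reachable on the LAN address RAGFlow will use?
curl -s http://192.168.x.x:11434/
```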
Use http://host.docker.internal:11434
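Worth noting: on Linux, host.docker.internal only resolves inside a container that has a host-gateway mapping; RAGFlow's compose file may already provide it, but if not, here is a sketch of the check and the mapping (the container name ragflow-server is an assumption, adjust to whatever docker ps shows, and curl must exist in the image):

```bash
# From inside the RAGFlow container, check that the host name resolves
# and that Ollama answers ("Ollama is running").
docker exec -it ragflow-server curl -s http://host.docker.internal:11434/

# If the name does not resolve, add a host-gateway mapping to the RAGFlow
# service in docker-compose.yml and recreate the container:
#   extra_hosts:
#     - "host.docker.internal:host-gateway"
```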
It works for me, thank you @Nehcknarf
Problem: When attempting to add an Ollama model, I encountered the error Fail to access model(ollama). ERROR: [Errno 111] Connection refused. Upon investigation, I found that while curl http://localhost:11434 worked correctly, curl http://172.16.xx.xxx:11434 did not, indicating the service was only listening on localhost.
Diagnosis: Running the following command confirmed the issue:
sudo netstat -tuln | grep 11434
Output:
tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN
This indicated that the service was bound to 127.0.0.1 instead of 0.0.0.0.
Solution: To resolve this, you need to configure the service to listen on all network interfaces by following these steps:
Edit the Ollama service configuration:
sudo systemctl edit ollama.service
Add the environment variable to bind to all interfaces: In the editor that opens, add the following lines under the [Service] section:
[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"
Reload systemd and restart the Ollama service:
sudo systemctl daemon-reload
sudo systemctl restart ollama
Verify the changes:
sudo netstat -tuln | grep 11434
Output should now show:
tcp6 0 0 :::11434 :::* LISTEN
This configuration change allows the Ollama service to listen on all network interfaces, resolving the connection issue.
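A quick end-to-end check after the restart, as a sketch (replace the address with your machine's own LAN IP, and the container name with whatever docker ps shows for RAGFlow; curl must exist in the image):

```bash
# From the host: the LAN address should now answer, not just localhost.
curl -s http://172.16.xx.xxx:11434/          # expect: "Ollama is running"

# From inside the RAGFlow container, against the same host address.
docker exec -it ragflow-server curl -s http://172.16.xx.xxx:11434/
```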
Amazing! Thank you, this is exactly what I needed!
Wow! Worked like a charm!!! Thank you.
I met the same issue and solved it by using http://host.docker.internal:11434/, while Ollama and RAGFlow run on the same machine and only RAGFlow is in Docker.
Try this IP: 172.17.0.1 (the default Docker bridge gateway), and check whether an HTTP proxy is on or not. Sometimes an HTTP proxy will block the connection.
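To find the bridge gateway on your own machine and to rule out a proxy, a quick sketch (172.17.0.1 is only the default; proxy variables matter both on the host and inside the RAGFlow container):

```bash
# The host's address on Docker's default bridge (usually 172.17.0.1).
ip -4 addr show docker0

# Any proxy variables that could intercept the request?
env | grep -i _proxy

# Test the bridge address while explicitly bypassing any proxy.
# (Ollama still has to listen on 0.0.0.0, or on the docker0 address,
# for this to answer.)
curl --noproxy '*' -s http://172.17.0.1:11434/
```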