
[Question]: Trouble connecting to Ollama

homegrownhrbs opened this issue 1 year ago • 12 comments

Describe your problem

hint: 102 Fail to access model(ollama). ERROR: [Errno 111] Connection refused

I double-checked that http://localhost:11434/ shows "Ollama is running".

I'm not sure what to do now, any tips?

homegrownhrbs avatar Apr 24 '24 22:04 homegrownhrbs

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

KevinHuSh avatar Apr 25 '24 00:04 KevinHuSh

sudo EDITOR=nano systemctl edit ollama.service

[Service]
Environment="OLLAMA_HOST={your_local_ip}"

If the changes are not accepted, edit this file directly: sudo nano /etc/systemd/system/ollama.service.d/override.conf
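For reference, `systemctl edit` creates the drop-in file itself, so it does not need to exist beforehand; the result would look like this (the IP is a placeholder), followed by a daemon-reload and a restart of the ollama service:

```ini
# /etc/systemd/system/ollama.service.d/override.conf
[Service]
Environment="OLLAMA_HOST=192.168.1.8"
```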

sensoryfox avatar Apr 25 '24 01:04 sensoryfox

I deployed RAGFlow with Docker on my server, and Ollama also serves on the same server. I set the baseURL to http://0.0.0.0:11434 and to http://public-IP:11434; neither worked. Any suggestions? HELP!

calchemist avatar Apr 25 '24 01:04 calchemist

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

homegrownhrbs avatar Apr 25 '24 02:04 homegrownhrbs

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

On your Debian 12 server, run `ip addr` (ipconfig is the Windows equivalent), get your IPv4 address; suppose it is 10.x.x.8, then set the baseURL as: http://10.x.x.8:11434
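Since ipconfig is a Windows command, the Debian equivalent is `ip -4 -o addr show scope global`. A small sketch (the `first_ipv4` helper is my own, not part of any tool mentioned here) that pulls the first address out of that output:

```shell
# first_ipv4: read `ip -4 -o addr` style output on stdin and print the
# first IPv4 address without its /prefix length.
first_ipv4() {
  awk '{split($4, a, "/"); print a[1]; exit}'
}

# Typical use on the Debian host:
#   ip -4 -o addr show scope global | first_ipv4
```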

calchemist avatar Apr 25 '24 04:04 calchemist

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

Since the RAGFlow Docker container and the host server are on different networks, 127.0.0.1 or 0.0.0.0 won't work.

calchemist avatar Apr 25 '24 04:04 calchemist

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

Since the RAGFlow Docker container and the host server are on different networks, 127.0.0.1 or 0.0.0.0 won't work.

Please star RAGFlow.

KevinHuSh avatar Apr 25 '24 06:04 KevinHuSh

I probably didn't explain my point in enough detail. The point is that Ollama needs to be told which IP to listen on, and this is done in this file:

/etc/systemd/system/ollama.service.d/override.conf

Until I edited that file, nothing worked for me.

sudo EDITOR=nano systemctl edit ollama.service

[Service]
Environment="OLLAMA_HOST={your_local_ip}"

If the changes are not accepted, edit this file directly: sudo nano /etc/systemd/system/ollama.service.d/override.conf

sensoryfox avatar Apr 25 '24 16:04 sensoryfox

I probably didn't explain my point in enough detail. The point is that Ollama needs to be told which IP to listen on, and this is done in this file:

/etc/systemd/system/ollama.service.d/override.conf

Until I edited that file, nothing worked for me.

sudo EDITOR=nano systemctl edit ollama.service

[Service]
Environment="OLLAMA_HOST={your_local_ip}"

If the changes are not accepted, edit this file directly: sudo nano /etc/systemd/system/ollama.service.d/override.conf

I do not have that file/path. I did find /etc/systemd/system/ollama.service, but not /etc/systemd/system/ollama.service.d/override.conf.

This is the content of that ollama.service file:

Should I try to replace the line Environment="PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games" with Environment="OLLAMA_HOST=192.168.x.x"?

(By the way, I am super grateful for your help and knowledge. I am brand new to self-hosting, and I really can't believe how helpful you and the other FOSS/self-host community members are. It is really incredible.)

homegrownhrbs avatar Apr 25 '24 17:04 homegrownhrbs

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

On your Debian 12 server, run `ip addr` (ipconfig is the Windows equivalent), get your IPv4 address; suppose it is 10.x.x.8, then set the baseURL as: http://10.x.x.8:11434

Okay I have done the following:

  1. Confirmed my local IP as 192.168.1.x
  2. Navigated to /user-setting/model in the RAGFlow UI
  3. Entered the Model Type as Chat, the Model Name as "Ollama" or "Llama 3" (tried both), and the Base URL as http://192.168.x.x:11434 (the local IP of the Debian machine running Ollama)

The error is persistent: Fail to access model(Ollama).**ERROR**: [Errno 111] Connection refused


One thing I did notice: when I visit http://localhost:11434/ in Firefox I see the message "Ollama is running"; however, when I navigate to http://192.168.x.x:11434/ I get "Firefox can't establish a connection to the server".

Does this mean I need to configure Ollama to listen on my local network somehow?
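The browser test above can be reproduced from a terminal; this `probe` helper is my own sketch, using bash's /dev/tcp, and it succeeds only when something actually accepts the connection, which is exactly what [Errno 111] Connection refused is reporting:

```shell
# probe HOST PORT: exit 0 if a TCP connection to HOST:PORT is accepted,
# non-zero on "connection refused" or timeout. Needs bash and coreutils.
probe() {
  timeout 2 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# In the situation described above, probing 127.0.0.1 11434 would succeed
# while probing 192.168.x.x 11434 would fail, until Ollama is rebound.
```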

homegrownhrbs avatar Apr 25 '24 17:04 homegrownhrbs

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

On your Debian 12 server, run `ip addr` (ipconfig is the Windows equivalent), get your IPv4 address; suppose it is 10.x.x.8, then set the baseURL as: http://10.x.x.8:11434

Okay I have done the following:

  1. Confirmed my local IP as 192.168.1.x
  2. Navigated to /user-setting/model in the RAGFlow UI
  3. Entered the Model Type as Chat, the Model Name as "Ollama" or "Llama 3" (tried both), and the Base URL as http://192.168.x.x:11434 (the local IP of the Debian machine running Ollama)

The error is persistent: Fail to access model(Ollama).**ERROR**: [Errno 111] Connection refused

One thing I did notice: when I visit http://localhost:11434/ in Firefox I see the message "Ollama is running"; however, when I navigate to http://192.168.x.x:11434/ I get "Firefox can't establish a connection to the server".

Does this mean I need to configure Ollama to listen on my local network somehow?

Use http://host.docker.internal:11434
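Worth noting: host.docker.internal resolves automatically on Docker Desktop (macOS/Windows), but on a native Linux Docker engine it has to be mapped explicitly, e.g. with extra_hosts in the compose file (the service name below is illustrative, not necessarily RAGFlow's actual one):

```yaml
services:
  ragflow:
    extra_hosts:
      # map host.docker.internal to the host's gateway address (Docker 20.10+)
      - "host.docker.internal:host-gateway"
```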

Nehcknarf avatar Apr 28 '24 10:04 Nehcknarf

One thing I did notice: when I visit http://localhost:11434/ in Firefox I see the message "Ollama is running"; however, when I navigate to http://192.168.x.x:11434/ I get "Firefox can't establish a connection to the server".

It works for me, thank you @Nehcknarf

pengyonglei avatar May 19 '24 13:05 pengyonglei

Problem: When attempting to add an Ollama model, I encountered the error Fail to access model(ollama). ERROR: [Errno 111] Connection refused. Upon investigation, I found that while curl http://localhost:11434 worked correctly, curl http://172.16.xx.xxx:11434 did not, indicating the service was only listening on localhost.

Diagnosis: Running the following command confirmed the issue:

sudo netstat -tuln | grep 11434

Output:

tcp 0 0 127.0.0.1:11434 0.0.0.0:* LISTEN

This indicated that the service was bound to 127.0.0.1 instead of 0.0.0.0.

Solution: To resolve this, you need to configure the service to listen on all network interfaces by following these steps:

Edit the Ollama service configuration:

sudo systemctl edit ollama.service

Add the environment variable to bind to all interfaces: In the editor that opens, add the following lines under the [Service] section:

[Service]
Environment="OLLAMA_HOST=0.0.0.0:11434"

Reload systemd and restart the Ollama service:

sudo systemctl daemon-reload
sudo systemctl restart ollama

Verify the changes:

sudo netstat -tuln | grep 11434

Output should now show:

tcp6 0 0 :::11434 :::* LISTEN

This configuration change allows the Ollama service to listen on all network interfaces, resolving the connection issue.

cw4219 avatar May 28 '24 13:05 cw4219

Amazing! Thank you, this is exactly what I needed!

homegrownhrbs avatar May 30 '24 21:05 homegrownhrbs

As your IP address shows, RAGFlow and Ollama are deployed on the same machine, aren't they?

Yes, they are on the same machine running Debian 12. However, Ollama is not dockerized; maybe that is relevant?

On your Debian 12 server, run `ip addr` (ipconfig is the Windows equivalent), get your IPv4 address; suppose it is 10.x.x.8, then set the baseURL as: http://10.x.x.8:11434

Okay I have done the following:

  1. Confirmed my local IP as 192.168.1.x
  2. Navigated to /user-setting/model in the RAGFlow UI
  3. Entered the Model Type as Chat, the Model Name as "Ollama" or "Llama 3" (tried both), and the Base URL as http://192.168.x.x:11434 (the local IP of the Debian machine running Ollama)

The error is persistent: Fail to access model(Ollama).**ERROR**: [Errno 111] Connection refused. One thing I did notice: when I visit http://localhost:11434/ in Firefox I see the message "Ollama is running"; however, when I navigate to http://192.168.x.x:11434/ I get "Firefox can't establish a connection to the server". Does this mean I need to configure Ollama to listen on my local network somehow?

Use http://host.docker.internal:11434

Wow! Worked like a charm!!! Thank you.

cse-repon avatar Aug 12 '24 07:08 cse-repon

I hit the same issue and solved it by using http://host.docker.internal:11434/, with Ollama and RAGFlow running on the same machine and only RAGFlow in Docker.

Aniwine avatar Sep 25 '24 02:09 Aniwine

Try this IP: 172.17.0.1 (the default Docker bridge gateway), and check whether a proxy is on or not. Sometimes an HTTP proxy will block the connections.

KevinHuSh avatar Sep 26 '24 01:09 KevinHuSh