
[Question]: Model Ollama cannot connect

Open ginisksam opened this issue 1 year ago • 43 comments

Describe your problem

But the LLM options are limited. I have an Ollama Mistral instance running at 127.0.0.1:11434 but cannot add Ollama as a model in RAGFlow. Please assist. This software is very good and flexible for document split/chunk/semantic embedding. Many thanks.

ginisksam avatar Apr 12 '24 01:04 ginisksam

Do you mean on the demo website or locally deployed? If on the demo website, 127.0.0.1 is not an accessible IP. Make sure the server deploying Ollama has an internet-accessible IP address. If you deploy RAGFlow locally, make sure both Ollama and RAGFlow are on the same LAN and can communicate with each other. A correct Ollama IP and port is the key.
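A quick way to verify connectivity from wherever RAGFlow runs (a sketch; replace the IP with your Ollama server's actual address):

    # A healthy Ollama instance answers the root path with "Ollama is running"
    curl http://192.168.0.100:11434
    # List the locally pulled models to confirm the model name you plan to add exists
    curl http://192.168.0.100:11434/api/tags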

KevinHuSh avatar Apr 12 '24 02:04 KevinHuSh

Locally deployed. The error is as follows:

hint : 102 Fail to access model(mistral).ERROR: [Errno 111] Connection refused

As you know, Ollama is really popular now for local machines.

OK, I got your message. The IP on the same LAN is the key. I will try restarting Ollama as root with OLLAMA_HOST=0.0.0.0:11434 ollama serve

Thanks

ginisksam avatar Apr 12 '24 02:04 ginisksam

Hello, I got the same problem.

hint : 102 Fail to access model(qwen:14b).ERROR: [Errno 111] Connection refused

I modified Environment="OLLAMA_HOST=0.0.0.0", and when I open 0.0.0.0:11434 in the browser, it shows "Ollama is running".

But I still couldn't add the model on the web. Could you help me? Thanks.

shaoxinghua0623 avatar Apr 12 '24 03:04 shaoxinghua0623

Thanks a lot!

I tried it: the Ollama connection failed at first, but after toggling on "Does it support Vision?", the model could be added successfully.

However, the Ollama option is not listed in the chat options, so the successful addition seems to be an illusion.

In the chat configuration, the Ollama model I just added cannot be selected.

Hope this gets fixed.

mjiulee avatar Apr 12 '24 04:04 mjiulee

Yes, I got the same problem as you. @mjiulee

shaoxinghua0623 avatar Apr 12 '24 05:04 shaoxinghua0623

@shaoxinghua0623

I tried again and, magically, it worked.

Just change the URL to the IP address of the server where Ollama is installed, for example: http://192.168.0.100:11434

Then everything is OK.

mjiulee avatar Apr 12 '24 05:04 mjiulee

@mjiulee

It really works. Thank you, brother!

shaoxinghua0623 avatar Apr 12 '24 05:04 shaoxinghua0623

@shaoxinghua0623 I have the same issue on Ubuntu 22.04. Did the above resolve your issue? If yes, can you please tell me how to find the appropriate IP for the Ollama URL?

matheospower avatar Apr 12 '24 08:04 matheospower

@matheospower you can use the command ifconfig in the terminal to find the IP of your Ubuntu machine. The Ollama base URL is then http://<IP of your Ubuntu>:11434. Note that the IP of your Ubuntu machine is not 0.0.0.0 or 127.0.0.1.
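For example (interface names and addresses vary by machine; pick the LAN address, not a loopback one):

    # Show this machine's IPv4 addresses, e.g. 192.168.x.x on a home LAN
    ip -4 addr show | grep inet
    # or more simply:
    hostname -I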

shaoxinghua0623 avatar Apr 12 '24 12:04 shaoxinghua0623

@matheospower you can use the command ifconfig in the terminal to find the IP of your Ubuntu machine. The Ollama base URL is then http://<IP of your Ubuntu>:11434. Note that the IP of your Ubuntu machine is not 0.0.0.0 or 127.0.0.1.

Thank you for the answer! Unfortunately, this did not resolve my problem. Not sure if I need to open a new issue, but I will post it here.

My problem is that I get stuck in the pop-up to add an Ollama model. I tested the Ollama service (it is running) with curl from both outside and inside the ragflow-server container, and it seems fine and can be reached. After setting the URL in the pop-up and clicking OK, it loads for some time and then gives me a connection timeout. Also, I cannot see anything in docker logs -f ragflow-server or in the ragflow-logs directory.
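For reference, the checks looked roughly like this (container name taken from the default compose file; the base URL is the same one entered in the pop-up):

    # From the host: Ollama answers
    curl http://127.0.0.1:11434
    # From inside the RAGFlow container:
    docker exec -it ragflow-server curl http://<your-ollama-url>:11434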

If anyone had a similar issue or can give a hint on how to troubleshoot please let me know!

matheospower avatar Apr 12 '24 13:04 matheospower

Hi, same issue here. I tested using http://host.docker.internal:11434/ as the base URL (that's probably the way to go, especially in a Docker deployment model), but I got the error "Hint 102 : Fail to access model(/mistral).ERROR: [Errno -2] Name or service not known".

fredrousseau avatar Apr 12 '24 23:04 fredrousseau

Found a way to solve the issue: I had to change my Ollama settings to Environment="OLLAMA_HOST=PRIVATEIP" to get it exposed. It looks like if Ollama listens only on the loopback (127.0.0.1), RAGFlow is not able to reach it.
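On a systemd-based install, that change looks roughly like this (a sketch; substitute your actual LAN IP, or 0.0.0.0 to listen on all interfaces):

    # Open an override file for the Ollama service
    sudo systemctl edit ollama.service
    # In the editor, add:
    #   [Service]
    #   Environment="OLLAMA_HOST=192.168.0.100"
    # Then reload and restart:
    sudo systemctl daemon-reload
    sudo systemctl restart ollama
    # Verify it now answers on the LAN address
    curl http://192.168.0.100:11434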

fredrousseau avatar Apr 12 '24 23:04 fredrousseau

To solve this problem:

  1. Make sure Ollama itself is OK.
  2. Configure the base URL as described above.

ShawnHoo7256 avatar Apr 13 '24 04:04 ShawnHoo7256

Found a way to solve the issue: I had to change my Ollama settings to Environment="OLLAMA_HOST=PRIVATEIP" to get it exposed. It looks like if Ollama listens only on the loopback (127.0.0.1), RAGFlow is not able to reach it.

Cool. Will give it a try.

FINDINGS: Just discovered that my existing Ollama, which works well with LangChain, is not running at root level.

If I edit my ollama.service file, set Environment="OLLAMA_HOST=PRIVATEIP", and run systemctl start ollama.service, then opening PRIVATEIP:11434 in the browser shows "Ollama is running". Fine.

But in the terminal, ollama list shows that all the models are missing! Case in point: can Ollama reside under both root and a user, and serve at the root or user level separately at any one time, without one affecting the other? OS: Linux Mint 21.3 (newbie).
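A guess (unverified): the systemd service runs Ollama as its own service user with its own model store, while ollama list in your terminal reads ~/.ollama/models for your login user, so the two do not see each other's models. A sketch of pointing the service at your user's store (paths assumed; file permissions may also need adjusting):

    # In the systemd override (sudo systemctl edit ollama.service), add:
    #   [Service]
    #   Environment="OLLAMA_MODELS=/home/<your-user>/.ollama/models"
    sudo systemctl daemon-reload && sudo systemctl restart ollama
    ollama list   # models pulled as <your-user> should now appear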

ginisksam avatar Apr 13 '24 07:04 ginisksam

Found a way to solve the issue: I had to change my Ollama settings to Environment="OLLAMA_HOST=PRIVATEIP" to get it exposed. It looks like if Ollama listens only on the loopback (127.0.0.1), RAGFlow is not able to reach it.

This didn't work for me either.

OmegAshEnr01n avatar Apr 16 '24 10:04 OmegAshEnr01n

If you are in Docker and cannot connect to a service on your host machine that is bound to a local interface or loopback:

  • localhost
  • 127.0.0.1
  • 0.0.0.0

Then in Docker you need to replace the localhost part with host.docker.internal. For example, if Ollama runs on the host machine bound to http://127.0.0.1:11434/, you should put http://host.docker.internal:11434 into the connection URL in the Ollama setup.

Important: On Linux, http://host.docker.internal:xxxx does not work out of the box.

hiwujie avatar Apr 19 '24 06:04 hiwujie

I can solve this problem! It is very simple! In the base URL field, add v1, for example: http://192.168.0.100:11434/v1. Note: you must add v1. My guess is that RAGFlow imitates OpenAI's calling format, and Ollama's official OpenAI-compatible service is also served under v1! With this, the model can be added!
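You can confirm the OpenAI-compatible endpoint is reachable before adding the model (IP illustrative):

    # Ollama serves an OpenAI-compatible API under /v1; this returns the model list as JSON
    curl http://192.168.0.100:11434/v1/models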

ganchun1130 avatar Apr 24 '24 03:04 ganchun1130

If you are in Docker and cannot connect to a service on your host machine that is bound to a local interface or loopback:

  • localhost
  • 127.0.0.1
  • 0.0.0.0

Then in Docker you need to replace the localhost part with host.docker.internal. For example, if Ollama runs on the host machine bound to http://127.0.0.1:11434/, you should put http://host.docker.internal:11434 into the connection URL in the Ollama setup.

Important: On Linux, http://host.docker.internal:xxxx does not work out of the box.

This suggestion is so crucial that it should be added to /docs/ollama.md, IMHO.

MatrixWise avatar Apr 28 '24 19:04 MatrixWise

How can I make it work on Linux lol.

OmegAshEnr01n avatar Apr 30 '24 01:04 OmegAshEnr01n

If you are in Docker and cannot connect to a service on your host machine that is bound to a local interface or loopback:

  • localhost
  • 127.0.0.1
  • 0.0.0.0

Then in Docker you need to replace the localhost part with host.docker.internal. For example, if Ollama runs on the host machine bound to http://127.0.0.1:11434/, you should put http://host.docker.internal:11434 into the connection URL in the Ollama setup.

Important: On Linux, http://host.docker.internal:xxxx does not work out of the box.

host.docker.internal can work on Linux if you modify the docker-compose.yml by adding extra_hosts like this:

    extra_hosts:
      - "host.docker.internal:host-gateway"

Once host-gateway is mapped to host.docker.internal, you should be able to reach an Ollama instance running on the same host as RAGFlow (but not within the docker-compose) by referring to it as http://host.docker.internal:11434/
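In context, the fragment sits under the RAGFlow service in docker-compose.yml, roughly like this (service and image names assumed; check your own compose file):

    services:
      ragflow:
        image: infiniflow/ragflow:latest
        extra_hosts:
          # maps host.docker.internal to the host gateway, so it also resolves on Linux
          - "host.docker.internal:host-gateway"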

gaspardpetit avatar May 02 '24 14:05 gaspardpetit

Do you mean on the demo website or locally deployed? If on the demo website, 127.0.0.1 is not an accessible IP. Make sure the server deploying Ollama has an internet-accessible IP address. If you deploy RAGFlow locally, make sure Ollama and RAGFlow are on the same LAN and can communicate with each other. A correct Ollama IP and port is the key.

So you can't use 'http://localhost:11434' to connect to Ollama on the demo? You can only use 'http://localhost:11434' to connect to Ollama after local deployment, is that right? If I want to add ollama3 to the demo, what is the best way?

tslyellow avatar Jun 27 '24 06:06 tslyellow

http://host.docker.internal:11434

Hi, I run RAGFlow via Docker and Ollama locally on Windows. I set the URL to 'http://host.docker.internal:11434' and still get an error. Do you know what's going on? If you can, can you help me out?

tslyellow avatar Jun 27 '24 12:06 tslyellow

@tslyellow On Windows, when running a Linux container in WSL, if you want to reach a port on the Windows host, you need to add --add-host=host.docker.internal:host-gateway to your docker command line and target host.docker.internal (like you are doing above). If you are launching the container from docker compose, then see my post above about using extra_hosts.

If it still does not work, it may be that Ollama is bound to 127.0.0.1 by default, so the port may not be available outside of your loopback device. To instruct Ollama to listen on all network devices (including the Docker virtual network), set the OLLAMA_HOST environment variable to 0.0.0.0. Note that this will also expose Ollama to incoming traffic from outside your PC, so you may want to ensure that you have proper firewall settings in place. Alternatively, you may choose to bind Ollama to your WSL IP, which can be found by running ipconfig.
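A sketch of that change on Windows (quit and restart the Ollama tray app afterwards so it picks up the variable):

    REM Set OLLAMA_HOST for your user account (exposes Ollama beyond loopback)
    setx OLLAMA_HOST 0.0.0.0
    REM Verify after restarting Ollama (curl.exe ships with recent Windows)
    curl http://localhost:11434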

gaspardpetit avatar Jun 27 '24 13:06 gaspardpetit

ifconfig

Did you solve it?

zzlTim avatar Jul 24 '24 09:07 zzlTim

How can I make it work on Linux, lol.

Did you solve it?

zzlTim avatar Jul 24 '24 09:07 zzlTim

@shaoxinghua0623

I tried again and, magically, it worked.

Just change the URL to the IP address of the server where Ollama is installed, for example: http://192.168.0.100:11434

Then everything is OK.

Hello, I have already changed the URL to the IP address of the server where Ollama is installed, but I still ran into a connection problem. Do you have any insight? Thank you very much!

Stella12121 avatar Jul 25 '24 03:07 Stella12121

http://host.docker.internal:11434

Hi, I run RAGFlow via Docker and Ollama locally on Windows. I set the URL to 'http://host.docker.internal:11434' and still get an error. Do you know what's going on? If you can, can you help me out?

Did you solve it?

zzlTim avatar Jul 25 '24 07:07 zzlTim

Exactly the same steps, even on a new system and a new machine. It feels like their framework has a problem.

zzlTim avatar Jul 25 '24 14:07 zzlTim

@shaoxinghua0623

I tried again and, magically, it worked.

Just change the URL to the IP address of the server where Ollama is installed, for example: http://192.168.0.100:11434

Then everything is OK.

A few months later, magically, it stopped working again :(

yangboz avatar Aug 05 '24 03:08 yangboz

Same problem here...

cidxb avatar Aug 20 '24 08:08 cidxb