[Question]: Fail to access model(deepseek-r1:14b).**ERROR**: [Errno 111] Connection refused
Describe your problem
The ollama service has been enabled and works normally in Lobechat. The model name is correct, but there is an error: Fail to access model(deepseek-r1:14b). ERROR: [Errno 111] Connection refused
Try: 172.17.0.1
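For context, 172.17.0.1 is normally the Docker bridge gateway, so from inside the RAGFlow container it points back at the host. Below is a minimal sketch to confirm that this address actually reaches Ollama; the address and port are common defaults, not values confirmed in this thread, so adjust them to your setup:

```python
# Quick reachability check for an Ollama server from inside the RAGFlow container.
# 172.17.0.1 is the usual Docker bridge gateway and 11434 the default Ollama port;
# both are assumptions here, adjust them to your setup.
import json
import urllib.request

OLLAMA_BASE_URL = "http://172.17.0.1:11434"

try:
    # /api/tags lists the models Ollama has pulled locally (e.g. deepseek-r1:14b).
    with urllib.request.urlopen(f"{OLLAMA_BASE_URL}/api/tags", timeout=5) as resp:
        models = [m["name"] for m in json.load(resp).get("models", [])]
        print("Ollama is reachable. Models:", models)
except OSError as exc:
    print("Ollama is not reachable:", exc)
```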
Following this article, I changed the configuration to allow external access to ollama, and then the connection succeeded (but the chat feature doesn't seem to work): https://blog.csdn.net/chengxuquan/article/details/142449545
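One way to tell whether the remaining chat failure is on the Ollama side or the RAGFlow side is to call Ollama's generate API directly. A minimal sketch, assuming the base URL and model name mentioned in this thread:

```python
# Call Ollama's generate endpoint directly, bypassing RAGFlow entirely.
# If this succeeds but chat inside RAGFlow still fails, the problem is on the
# RAGFlow side rather than in Ollama's external-access configuration.
import json
import urllib.request

OLLAMA_BASE_URL = "http://172.17.0.1:11434"  # whatever base URL you configured in RAGFlow
payload = json.dumps({
    "model": "deepseek-r1:14b",              # the model from this issue; use one you have pulled
    "prompt": "Reply with a single short sentence.",
    "stream": False,
}).encode()

req = urllib.request.Request(
    f"{OLLAMA_BASE_URL}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.load(resp)["response"])
```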
I also encountered this issue. Are there any solutions?
Did you install the full version? I enabled ollama's external access before, and I could add the model on the website normally, but I couldn't actually use the LLM. Later I installed the full web version, and then the model could be added and used normally. You can use my method for reference.
I also solved this problem by setting the environment variable (OLLAMA_HOST=0.0.0.0:11434).
URL: http://host.docker.internal:11434
This works.
This is a big problem: other agents can detect a local ollama instance automatically and easily, but ragflow makes it so cumbersome.
Personally verified to work: on Windows, with ollama already installed, open port 11434 in the firewall, check the host machine's IP (for example 192.168.1.103), then from inside the ragflow container try `telnet 192.168.1.103 11434`. If it can connect, there will be no further problems.
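If telnet is not installed inside the container, the same connectivity check can be done with a short script. A minimal sketch, using the example IP from the comment above:

```python
# telnet-equivalent check from inside the ragflow container: open a plain TCP
# connection to the Windows host's LAN IP on the Ollama port.
import socket

HOST = "192.168.1.103"  # example host IP from the comment above; use your own
PORT = 11434            # default Ollama port

try:
    with socket.create_connection((HOST, PORT), timeout=5):
        print(f"{HOST}:{PORT} is reachable; the Ollama base URL should work in RAGFlow.")
except OSError as exc:
    print(f"Cannot reach {HOST}:{PORT}: {exc}")
```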