
add ollama error

Open • xuesd opened this issue 10 months ago • 9 comments

Self Checks

  • [x] This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.0.0-beta

Cloud or Self Hosted

Self Hosted (Source)

Steps to reproduce

[Screenshots: Ollama model provider configuration and the resulting error]

When adding the Ollama configuration in the model provider interface, with Ollama set to the local address 192.168.xx.xx:11434, the backend shows an error message.

✔️ Expected Behavior

Ollama is added successfully.

❌ Actual Behavior

The backend shows an error message.

xuesd · Feb 25 '25

I found a similar open issue and discussions related to the error when adding the Ollama model configuration:

  1. Error when adding ollama model - This discussion suggests upgrading Ollama to the latest version and using the new endpoint api/embed, since the old one, api/embeddings, has been deprecated. It also recommends ensuring the Ollama service is running and accessible, setting the OLLAMA_HOST environment variable to 0.0.0.0, using the correct IP address for Docker, and adjusting firewall settings to allow incoming connections on port 11434 (a reachability sketch follows this list).

  2. Can't add the Ollama Models - This discussion might provide additional insights or similar experiences from other users.

  3. Help: I got an error user ollama with model llava-phi3 - Another discussion that might be relevant to your issue.
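
As a quick way to act on the first suggestion, the sketch below probes an Ollama server from the machine running Dify. It is a minimal illustration, not from the discussion itself: the base URL and model name are hypothetical placeholders you would replace with your own.

```python
# Minimal sketch: check that an Ollama server is reachable and that the
# newer /api/embed endpoint responds. OLLAMA_URL and MODEL are
# hypothetical placeholders - substitute your own host and model.
import json
import urllib.request

OLLAMA_URL = "http://192.168.1.100:11434"  # replace with your Ollama host
MODEL = "llama3"                           # replace with a model you have pulled

# 1. Basic reachability: Ollama answers "Ollama is running" on /.
with urllib.request.urlopen(OLLAMA_URL, timeout=5) as resp:
    print("root:", resp.status, resp.read().decode())

# 2. The newer embeddings endpoint; /api/embed supersedes the
#    deprecated /api/embeddings mentioned above.
req = urllib.request.Request(
    f"{OLLAMA_URL}/api/embed",
    data=json.dumps({"model": MODEL, "input": "hello"}).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=30) as resp:
    print("embed:", resp.status)
```

If the root request already fails, the problem is network exposure (OLLAMA_HOST, Docker networking, or the firewall) rather than the endpoint itself.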

To resolve the error when adding the Ollama configuration with the local address 192.168.xx.xx:11434, ensure that the Ollama service is properly exposed and accessible from the network. You can follow these steps (a connectivity check sketch follows the list):

  • Set the OLLAMA_HOST environment variable to 0.0.0.0 to bind it to all network interfaces.
  • If using Docker, replace localhost with host.docker.internal in your service configuration.
  • Ensure the network configuration allows access to the specified IP and port.
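
A quick sketch of the last point, assuming the Dify API container has Python available: test whether the Ollama port is reachable from inside the container. Both addresses below are hypothetical examples.

```python
# Check TCP reachability of the Ollama port from inside the Dify
# container. Hostnames/IPs here are hypothetical examples.
import socket

for host in ("host.docker.internal", "192.168.1.100"):
    try:
        with socket.create_connection((host, 11434), timeout=3):
            print(f"{host}:11434 is reachable")
    except OSError as exc:
        print(f"{host}:11434 unreachable: {exc}")
```

Note that on Linux, host.docker.internal only resolves if the compose service maps it, e.g. with an extra_hosts entry of "host.docker.internal:host-gateway".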

These steps should help resolve the connection error and allow you to successfully add the Ollama configuration [1][2].


dosubot[bot] · Feb 25 '25

[Screenshots: web console error messages]

xuesd · Feb 26 '25

Who can help me?

xuesd · Feb 27 '25

> Who can help me?

If Dify is deployed using Docker, consider using http://host.docker.internal:11434 instead of the local network IP address to access the Ollama service on the host machine.

ref: https://docs.dify.ai/development/models-integration/ollama

P.S. I ran into the same problem yesterday and solved it this way.
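
To illustrate why this works (a hedged sketch, not part of the original comment): inside a container, localhost refers to the container itself, so the Ollama base URL must point at the Docker host instead. Ollama's /api/tags endpoint lists the models it has pulled, which makes it a convenient smoke test.

```python
# Hypothetical check from inside the Dify container: list the models the
# host's Ollama knows about. Uses host.docker.internal rather than
# localhost, which would point at the container itself.
import urllib.request

BASE_URL = "http://host.docker.internal:11434"  # not http://localhost:11434

with urllib.request.urlopen(f"{BASE_URL}/api/tags", timeout=5) as resp:
    print(resp.status, resp.read()[:200])  # JSON list of pulled models
```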

hefeixuefei · Feb 28 '25

[Screenshots: error messages after the update]

After yesterday's update, the model was still added, but now there is an error like the one in the screenshots, and adding the API also fails.

xuesd · Feb 28 '25

Did you fix it? I'm having the same issue.

lanyur · Mar 06 '25

My Ollama service and the Dify Docker container are not on the same machine. I have ensured that Ollama's port 11434 is accessible from the Docker server.

lanyur · Mar 06 '25

Same error!

luozhy88 · Mar 07 '25

Latest update: I found that after restarting Docker Compose, it started working properly. Version 1.0.0 isn't very stable.

lanyur · Mar 08 '25