Adding model provider Ollama fails
Self Checks
- [X] This is only for bug reports; if you would like to ask a question, please head to Discussions.
- [X] I have searched for existing issues, including closed ones.
- [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [X] Please do not modify this template :) and fill in all the required fields.
Dify version
0.6.6
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I have followed the instructions at https://docs.dify.ai/tutorials/model-configuration/ollama#if-you-are-using-docker-to-deploy-dify-and-ollama-you-may-encounter-the-following-error
Then I started `ollama serve`, which works fine.
The error shown in the image below occurred when I finally tried to add the model.
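For reference, the relevant steps from that tutorial look roughly like the following sketch (0.0.0.0 and port 11434 are Ollama's documented defaults; the model name is just an example, not taken from this report):

```shell
# Expose Ollama on all interfaces so a Dify container can reach it,
# instead of binding only to 127.0.0.1
export OLLAMA_HOST=0.0.0.0
ollama serve

# In another terminal, pull a model to test with (example model name)
ollama pull llama3
```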
✔️ Expected Behavior
No response
❌ Actual Behavior
No response
To address the issue where "ollama_lambda" is not found in the PATH during your attempt to add the Ollama model to your self-hosted Dify setup, follow these troubleshooting steps:
- **Check Docker Networking:** Ensure that the Ollama service is accessible from the Dify container. This may involve configuring Docker networking options to allow containers to communicate.
- **Adjust the Ollama URL:** If both services are on the same host but in separate containers, use `host.docker.internal` instead of `localhost` to refer to the host from within a container. For example, set the Ollama URL in Dify to `http://host.docker.internal:11434`.
- **Verify Ollama Service Configuration:** Make sure the Ollama service is running and correctly configured to accept connections. This includes setting the `OLLAMA_BASE_URL` environment variable to the correct address where the Ollama service is accessible.
Ensure all environment variable settings and Docker network configurations are correctly applied. Also, check for any firewall rules or network policies that might be blocking the connection between Dify and Ollama.
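As a quick sanity check for the points above, you can probe connectivity from both the host and the Dify container. This is a hedged sketch: the container name `docker-api-1` is an assumption (find yours with `docker ps`), and 11434 is Ollama's default port.

```shell
# From the host: confirm Ollama is listening
# (Ollama's root endpoint typically replies "Ollama is running")
curl http://localhost:11434

# From inside the Dify API container (container name is an assumption):
docker exec -it docker-api-1 curl http://host.docker.internal:11434

# On Linux, host.docker.internal is not defined by default; map it to the
# host gateway in docker-compose.yml for the api and worker services:
#   extra_hosts:
#     - "host.docker.internal:host-gateway"
```

If the second `curl` fails while the first succeeds, the problem is Docker networking rather than Ollama itself.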
Details
Sources
- en/tutorials/model-configuration/ollama.md
- api/tests/integration_tests/model_runtime/ollama/test_llm.py
- zh_CN/guides/model-configuration/ollama.md
- api/core/model_runtime/model_providers/ollama/ollama.yaml
- api/app.py
- en/user-guide/launching-dify-apps/developing-with-apis/README.md
I have it working!! I can help you fix it. Write to me on Discord: @santiblanko