The Agent that invokes a local model cannot perform tool calling
Self Checks
- [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
1.2.0
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
With the SiliconFlow API interface, this problem does not occur; it only appears when the Agent invokes the locally hosted model.
✔️ Expected Behavior
No response
❌ Actual Behavior
The issue you're experiencing, where the Agent fails at tool calling when invoking the local model in Dify 1.2.0 self-hosted with Docker, might be related to the way localhost is handled within Docker containers. In Docker, localhost refers to the container itself, which can cause issues when trying to access services running on the host machine.
To resolve this, you should replace localhost with host.docker.internal to access the Docker host. This change allows the service to connect correctly to the host machine's services. Here's how you can do it:
http://host.docker.internal:11434
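To confirm that the containers can actually reach Ollama, you can probe the endpoint from inside the Dify API container first. This is only a sketch, not part of the official instructions: it assumes the default docker-compose service name api, Ollama's default port 11434, and that curl is available in the image (otherwise a python -c one-liner with urllib works the same way).

# Run from the directory containing Dify's docker-compose.yml.
# Ollama's /api/tags endpoint lists the locally available models, so any JSON
# response here confirms that the container can resolve and reach the host.
docker compose exec api curl -s http://host.docker.internal:11434/api/tags

# On Linux, host.docker.internal is not defined by default; map it to the host
# gateway with an extra_hosts entry in a compose override (YAML shown as a comment):
#   services:
#     api:
#       extra_hosts:
#         - "host.docker.internal:host-gateway"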
Additionally, if you are running Ollama on macOS, you may need to set environment variables using launchctl:
launchctl setenv OLLAMA_HOST "0.0.0.0"
For Linux, set the environment variable in the systemd service:
[Service]
Environment="OLLAMA_HOST=0.0.0.0"
For Windows, edit or create the OLLAMA_HOST environment variable for your user account. These steps ensure that the service connects correctly [1][2].
To continue talking to Dosu, mention @dosu.
Your model should support one of the tool-calling features.
I have already done this configuration in the model interface.
@doit-5618
check the api: /console/api/workspaces/current/models/model-types/llm
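One way to inspect that endpoint is with curl while authenticated as a console user. This is a rough sketch rather than an official procedure; the base URL and the Bearer token placeholder are assumptions to adapt to your deployment. The point is to check whether your local model actually advertises tool-calling support:

# <console-access-token> is a placeholder for your Dify console token.
curl -s -H "Authorization: Bearer <console-access-token>" \
  http://localhost/console/api/workspaces/current/models/model-types/llm
# In the JSON response, the entry for your Ollama model should list a
# tool-calling capability (e.g. "tool-call" / "multi-tool-call") in its
# "features" array; without it, the Agent cannot perform tool calls.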
I have already done this configuration in the model interface.