Agent node fails to invoke LLM model
Self Checks
- [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
1.1.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Initialize an Agent node in a chatflow with the official Agent plugin and the built-in Current Time tool.
✔️ Expected Behavior
The agent can answer questions by using tools.
❌ Actual Behavior
An error is returned:
Run failed: Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"Exception","message":"invoke llm model failed: request failed with status code: 404 and respond with: {"code": "not_found", "message": "The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.", "status": 404}\n"}
@mickhan Hello there! I'm here to help you with bug reports, answer questions, and assist you in becoming a contributor. Feel free to ask for assistance while waiting for a human maintainer.
The 404 error you're encountering when invoking the LLM model in the agent node is likely due to an incorrect or missing configuration in your setup. Here are some steps to help resolve this issue:
- **Check API Credentials**: Ensure that the API credentials used for the LLM model are correct and valid. Incorrect credentials can lead to a 404 error if the system cannot establish a valid connection to the service [1].
- **Verify Model Access**: Make sure that the model you are trying to use is available and that you have access to it. Some models may return a 404 error if they do not exist or if you do not have the necessary permissions, even with a valid subscription [2].
- **Update Configuration**: If you are using Dify version 1.0.0 or later, ensure that your `.env` file is correctly configured. Specifically, restore `PLUGIN_DIFY_INNER_API_KEY` to its default value as specified in `.env.example` and restart your Docker containers for the changes to take effect [3].
- **Environment Parameters**: For Dify 1.0.0, add the environment parameters `PLUGIN_REMOTE_INSTALL_HOST: ${EXPOSE_PLUGIN_DEBUGGING_HOST:-localhost}` and `PLUGIN_REMOTE_INSTALL_PORT: ${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}` for `langgenius/dify-api`, and ensure that `langgenius/dify-plugin-daemon` depends on the database (`depends_on: - db`). These configurations help ensure that the plugin can communicate correctly with the server [4]. A combined compose sketch follows this list.
- **Model Name Verification**: If you are using an Ollama model, verify that the model name is correct and exists in Ollama's repository. Using a non-existent model name will result in a 404 error [5].
By following these steps, you should be able to resolve the 404 error and successfully invoke the LLM model in your agent node.
The error is caused by the dify-api container missing the environment variable INNER_API_KEY_FOR_PLUGIN. The relevant check is here: https://github.com/langgenius/dify/blob/main/api/controllers/inner_api/wraps.py#L73. I think Dify should provide clearer documentation and a clearer error message for this case.
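For anyone else hitting this, the failure mode can be reproduced in miniature. The sketch below is not Dify's actual code (the real check is in the wraps.py linked above); the header name `X-Inner-Api-Key` and the decorator name are assumptions for illustration. It shows how a guard that answers 404 when the shared key is unset or mismatched produces exactly the `not_found` response the plugin daemon surfaced:

```python
# Illustrative sketch, not Dify's implementation; see the wraps.py link
# above for the real check. Header and function names are assumptions.
import os
from functools import wraps

from flask import abort, request

# If this variable is missing from the dify-api container, no incoming
# request can ever present a matching key, so every call is rejected.
INNER_API_KEY_FOR_PLUGIN = os.environ.get("INNER_API_KEY_FOR_PLUGIN")


def plugin_inner_api_only(view):
    """Allow a request through only if it carries the shared inner API key."""

    @wraps(view)
    def decorated(*args, **kwargs):
        provided = request.headers.get("X-Inner-Api-Key")
        if not INNER_API_KEY_FOR_PLUGIN or provided != INNER_API_KEY_FOR_PLUGIN:
            # Responding 404 instead of 401 hides the endpoint entirely,
            # which is why the plugin reports "not_found" rather than an
            # authentication error.
            abort(404)
        return view(*args, **kwargs)

    return decorated
```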