
Agent node invoke llm model failed

Open · mickhan opened this issue 9 months ago · 1 comment

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit the issue in English, otherwise it will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Initialize an agent node in a chatflow with the official Agent plugin and a built-in tool (Current Time).

✔️ Expected Behavior

The agent answers questions by invoking the configured tools.

❌ Actual Behavior

The run fails with the following error:

    Run failed: Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"Exception","message":"invoke llm model failed: request failed with status code: 404 and respond with: {"code": "not_found", "message": "The requested URL was not found on the server. If you entered the URL manually please check your spelling and try again.", "status": 404}\n"}


mickhan avatar Mar 26 '25 10:03 mickhan

@mickhan Hello there! I'm here to help you with bug reports, answer questions, and assist you in becoming a contributor. Feel free to ask for assistance while waiting for a human maintainer.

The 404 error you're encountering when invoking the LLM model in the agent node is likely due to an incorrect or missing configuration in your setup. Here are some steps to help resolve this issue:

  1. Check API Credentials: Ensure that the API credentials used for the LLM model are correct and valid. Incorrect credentials can lead to a 404 error if the system cannot establish a valid connection to the service [1].

  2. Verify Model Access: Make sure that the model you are trying to use is available and that you have access to it. Some models may return a 404 error if they do not exist or if you do not have the necessary permissions, even with a valid subscription [2].

  3. Update Configuration: If you are using Dify version 1.0.0 or later, ensure that your .env file is correctly configured. Specifically, restore the PLUGIN_DIFY_INNER_API_KEY to its default value as specified in .env.example and restart your Docker containers for the changes to take effect [3].

  4. Environment Parameters: For Dify 1.0.0, add the following environment parameters for langgenius/dify-api:

    PLUGIN_REMOTE_INSTALL_HOST: ${EXPOSE_PLUGIN_DEBUGGING_HOST:-localhost}
    PLUGIN_REMOTE_INSTALL_PORT: ${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}
    

    And for langgenius/dify-plugin-daemon, ensure it depends on the database:

    depends_on:
      - db
    

    These configurations help ensure that the plugin can communicate correctly with the server [4].

  5. Model Name Verification: If you are using an Ollama model, verify that the model name is correct and exists in Ollama's repository. Using a non-existent model name will result in a 404 error [5].

By following these steps, you should be able to resolve the 404 error and successfully invoke the LLM model in your agent node.
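Steps 3 and 4 above can be combined into a single sketch. This is illustrative only, not a verbatim excerpt of Dify's shipped compose file: the key values are placeholders (copy the real default from your own .env.example), and the service and variable names for the plugin daemon side (DIFY_INNER_API_KEY) are assumptions based on the shared-key pairing described in this thread:

```yaml
# Sketch only — merge into docker/docker-compose.yaml; placeholder values
# must be replaced with the defaults from .env.example.
services:
  api:
    image: langgenius/dify-api
    environment:
      # Must match the key the plugin daemon presents on inner-API calls;
      # if this is unset, inner-API requests from the plugin surface as 404.
      INNER_API_KEY_FOR_PLUGIN: ${PLUGIN_DIFY_INNER_API_KEY:-<value-from-.env.example>}
      PLUGIN_REMOTE_INSTALL_HOST: ${EXPOSE_PLUGIN_DEBUGGING_HOST:-localhost}
      PLUGIN_REMOTE_INSTALL_PORT: ${EXPOSE_PLUGIN_DEBUGGING_PORT:-5003}
  plugin_daemon:
    image: langgenius/dify-plugin-daemon
    environment:
      # The daemon authenticates to dify-api with the same shared key.
      DIFY_INNER_API_KEY: ${PLUGIN_DIFY_INNER_API_KEY:-<value-from-.env.example>}
    depends_on:
      - db
```

After editing, restart the stack (e.g. `docker compose down && docker compose up -d`) so the new environment takes effect.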


dosubot[bot] avatar Mar 26 '25 10:03 dosubot[bot]

The error is caused by dify-api missing the environment variable INNER_API_KEY_FOR_PLUGIN. The relevant code is here: https://github.com/langgenius/dify/blob/main/api/controllers/inner_api/wraps.py#L73 . I think Dify should provide clearer documentation and error messages.
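To illustrate the failure mode: a guard like the one linked above can make a perfectly valid route answer 404 when the server-side key is unset or mismatched, which is why the plugin sees "not_found" rather than an auth error. The sketch below is hypothetical (names like `inner_api_only` and the response shape are illustrative, not copied from Dify's wraps.py); it only assumes the env variable name `INNER_API_KEY_FOR_PLUGIN` from this thread:

```python
# Hypothetical sketch of an inner-API key guard; not Dify's actual code.
import os
from functools import wraps

def inner_api_only(handler):
    """Reject requests whose X-Inner-Api-Key header does not match the
    server-side INNER_API_KEY_FOR_PLUGIN environment variable."""
    @wraps(handler)
    def wrapper(headers, *args, **kwargs):
        expected = os.environ.get("INNER_API_KEY_FOR_PLUGIN")
        provided = headers.get("X-Inner-Api-Key")
        # If the server-side key is missing or mismatched, the route behaves
        # as if it does not exist — the caller sees a plain 404.
        if not expected or provided != expected:
            return {"code": "not_found", "status": 404}
        return handler(headers, *args, **kwargs)
    return wrapper

@inner_api_only
def invoke_llm(headers, prompt):
    # Stand-in for the inner LLM-invocation endpoint.
    return {"status": 200, "result": f"echo: {prompt}"}
```

Note that the unset-key case (`not expected`) fails exactly like a wrong key, which matches the symptom here: a missing env variable silently producing 404s.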

mickhan avatar Mar 27 '25 05:03 mickhan

(screenshot attachment)

junjiem avatar Mar 27 '25 06:03 junjiem