
Got error while invoke agent in workflow

Open dickens88 opened this issue 9 months ago • 9 comments

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

1.1.2

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Hi Team,

I got an error while trying to call an agent in a workflow. I found several historical issue tickets, and it seems someone linked a fix:

https://github.com/langgenius/dify-plugin-sdks/pull/45

I have updated to v1.1.2, and it works fine if I choose GPT-4o. But when I switch back to my local Ollama models, I still get the same error.


Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"ValidationError","message":"1 validation error for FunctionCallingParams\nmodel.entity\n Input should be a valid dictionary or instance of AIModelEntity [type=model_type, input_value=None, input_type=NoneType]\n For further information visit https://errors.pydantic.dev/2.8/v/model_type"}

✔️ Expected Behavior

No response

❌ Actual Behavior

No response

dickens88 avatar Mar 22 '25 18:03 dickens88

I found similar threads that might be relevant to your issue:

  1. Error reported when running an agent in a workflow [1].
  2. Agent node in workflow run status fails [2].

Additionally, there is an open issue related to a similar error message: An error message was displayed when an agent body was created in dify using ollama's local model API 400 [3]. The suggested solution for the 'Model schema not found' error involves enabling the 'Support function call' option in your Ollama model configuration. If you're still facing a 400 error, it might be due to model compatibility issues with the tools you're using. Ensure that the models are configured to support necessary functionalities, such as function calls and tools, if applicable.

For the specific error you're encountering, it seems related to the validation of the model entity input type. You might need to modify the AgentModelConfig class to allow the entity to be optional, as follows:

from typing import Optional

from pydantic import Field

class AgentModelConfig(LLMModelConfig):
    entity: Optional[AIModelEntity] = Field(default=None)

After making this change, restart the plugin daemon container using:

docker restart docker-plugin_daemon-1

This adjustment should help resolve the PluginInvokeError related to the validation error [4].
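A minimal, self-contained sketch of the pydantic v2 behavior behind this error. `AIModelEntity` here is a simplified stand-in, not the real Dify class: a required model-typed field rejects `None` with the same `model_type` error seen above, while an `Optional` field with a `None` default accepts it.

```python
from typing import Optional
from pydantic import BaseModel, Field, ValidationError

class AIModelEntity(BaseModel):
    """Simplified stand-in for Dify's AIModelEntity."""
    model: str

class StrictConfig(BaseModel):
    # Required field: passing entity=None raises a ValidationError
    entity: AIModelEntity

class RelaxedConfig(BaseModel):
    # Optional with a default: entity=None is accepted
    entity: Optional[AIModelEntity] = Field(default=None)

try:
    StrictConfig(entity=None)
except ValidationError as e:
    # pydantic v2 reports this as a "model_type" error, matching the log above
    print("strict rejects None:", e.errors()[0]["type"])

print("relaxed accepts None:", RelaxedConfig(entity=None).entity)
```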


dosubot[bot] avatar Mar 22 '25 18:03 dosubot[bot]

I'd like to update some information from my testing. I updated the Dify agent policy plugin cot_agent to v0.0.10, and the error disappeared this time.

But the output of the agent is always empty when it invokes functions. If there is no function calling, the output is normal...


dickens88 avatar Mar 23 '25 14:03 dickens88

This is because function calling for some models (such as those provided by Ollama) requires streaming to be set to False, while currently all requests are processed with stream set to true. This issue has been fixed, and we will update the plugin version in the next few days. At that point, you just need to update your plugin version and it will work.
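The fix described above can be sketched as a small helper that decides the effective stream flag. This is illustrative only; the function name and parameters are assumptions, not the plugin's actual API:

```python
# Sketch: force non-streaming when the call uses tools but the model
# does not support streamed tool calls (e.g. some Ollama models).
# All names here are illustrative, not the real plugin code.

def resolve_stream(requested_stream: bool,
                   has_tools: bool,
                   supports_stream_tool_call: bool) -> bool:
    """Return the stream flag actually sent to the model."""
    if has_tools and not supports_stream_tool_call:
        # Fall back to a blocking call so the tool-call payload
        # arrives in one piece and can be parsed.
        return False
    return requested_stream

# With tools on a model lacking streamed tool calls, stream is forced off:
print(resolve_stream(True, has_tools=True, supports_stream_tool_call=False))
# Without tools, the requested setting is kept:
print(resolve_stream(True, has_tools=False, supports_stream_tool_call=False))
```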

Nov1c444 avatar Mar 24 '25 01:03 Nov1c444

@Nov1c444 there is another issue: I can't change the default LLM parameters in the agent node:


Also refer to: https://github.com/langgenius/dify-official-plugins/issues/533

When using the LLM default values, everything works fine.

lordk911 avatar Mar 24 '25 07:03 lordk911

Got it. I'll fix it ASAP.

Nov1c444 avatar Mar 24 '25 07:03 Nov1c444

Got it. I'll fix it ASAP.

For Xinference there is some code to modify, if it helps:

https://github.com/langgenius/dify/issues/14904#issuecomment-2724775300

@Nov1c444

lordk911 avatar Mar 24 '25 07:03 lordk911

Please update the plugin version to 0.0.12 and try it. If there are no problems, I will close the issue.

Nov1c444 avatar Mar 27 '25 05:03 Nov1c444

Please update the plugin version to 0.0.12 and try it. If there are no problems, I will close the issue. @Nov1c444

I've tested function calling, and I can change the default LLM parameters in the agent node now. There is another question: can "maximum iterations" not be set to less than 2? When I set the value to 1, the agent tool list is not sent to the LLM.
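One plausible explanation for this behavior (a sketch of a common ReAct-style pattern, not the plugin's actual logic): if the agent reserves its final iteration for a tool-free "produce the answer" call, then with max_iterations=1 the very first call is already the final one, so no tools are ever offered to the model:

```python
# Illustrative agent loop: the last iteration is reserved for a final
# answer without tools, so max_iterations=1 never sends the tool list.
# This mirrors a common agent pattern; it is not the actual plugin code.

def tools_sent_per_iteration(max_iterations: int, tools: list) -> list:
    """Return the tool list offered to the model on each iteration."""
    calls = []
    for i in range(max_iterations):
        is_last = (i == max_iterations - 1)
        calls.append([] if is_last else list(tools))  # no tools on the last round
    return calls

print(tools_sent_per_iteration(1, ["search"]))  # tool list never sent
print(tools_sent_per_iteration(3, ["search"]))  # tools on rounds 1-2, none on 3
```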

lordk911 avatar Mar 27 '25 08:03 lordk911

Please update the plugin version to 0.0.12 and try it. If there are no problems, I will close the issue.

Great, it works for me. Thank you for your efforts.

dickens88 avatar Mar 27 '25 12:03 dickens88