Plugin not found when I use agent
Self Checks
- [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
- [x] [FOR CHINESE USERS] Please be sure to submit issues in English, otherwise they will be closed. Thank you! :)
- [x] Please do not modify this template :) and fill in all the required fields.
Dify version
1.0.0
Cloud or Self Hosted
Cloud
Steps to reproduce
When I run a workflow using an agent node, it fails. The error shows "Plugin not found" on the agent node, even though the Wikipedia plugin is already installed.
✔️ Expected Behavior
No response
❌ Actual Behavior
No response
Hello, @avasquezkudaw! I'm here to help you with any bugs, questions, or contributions you might have. Let's tackle this issue together!
The "Plugin not found" error on the agent node, despite having the Wikipedia plugin installed, might occur because the plugin needs to be reinstalled. This issue can happen if the plugin is installed from GitHub and is not properly recognized or linked in the system. You should try going to the Plugins section in Dify and reinstalling the plugin to resolve this issue [1].
To continue talking to Dosu, mention @dosu.
Help Dosu learn! Give it feedback: Great Response | Irrelevant Answer | Incorrect Sources | Too Verbose | Hallucination | Bug Report | Other
@dosu I have reinstalled all plugins and the error has changed. Now I get the following error:
Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"ValidationError","message":"1 validation error for FunctionCallingParams\nmodel.entity\n Input should be a valid dictionary or instance of AIModelEntity [type=model_type, input_value=None, input_type=NoneType]\n For further information visit https://errors.pydantic.dev/2.8/v/model_type"}
The error you're encountering indicates that the `model.entity` field in `FunctionCallingParams` is not being populated with a valid dictionary or instance of `AIModelEntity`. This is what causes the `ValidationError`.
To resolve this issue, ensure that the `model` field in your `FunctionCallingParams` is correctly set up. It should be an instance of `AIModelEntity` or a valid dictionary that matches the expected structure. Here's a brief overview of what the `AIModelEntity` should include:
- `model`: the identifier of the model (e.g., "gpt-4").
- `model_type`: the type of model, such as "llm" for language models.
- Other optional fields like `label`, `features`, and `model_properties` can be included as needed.
Make sure that the `model` field in your configuration is not null and is properly defined. You might need to check the part of your workflow where the model is being set and ensure it aligns with the expected format for `AIModelEntity` [1].
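The reported error matches pydantic v2's behavior when a required nested-model field receives `None`. The minimal sketch below reproduces it; `AIModelEntity`, `ModelConfig`, and `FunctionCallingParams` here are simplified stand-ins for the real `dify_plugin` classes, not their actual definitions:

```python
# Minimal reproduction of the ValidationError, assuming pydantic v2.
# These classes are simplified stand-ins, not the real dify_plugin types.
from pydantic import BaseModel, ValidationError


class AIModelEntity(BaseModel):
    model: str       # e.g. "gpt-4"
    model_type: str  # e.g. "llm"


class ModelConfig(BaseModel):
    entity: AIModelEntity  # required: None is rejected


class FunctionCallingParams(BaseModel):
    model: ModelConfig


try:
    # entity=None mirrors what the agent plugin receives when the
    # provider does not populate the model entity.
    FunctionCallingParams(model={"entity": None})
except ValidationError as e:
    # Prints: 1 validation error for FunctionCallingParams / model.entity /
    # "Input should be a valid dictionary or instance of AIModelEntity"
    print(e)
```

The `[type=model_type, input_value=None]` fragment in the original traceback is pydantic's standard wording for "wrong type passed to a nested model field", which points at the provider handing the plugin a `None` entity rather than at the plugin itself being missing.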
@dosu It is important to mention that the test was performed using the Azure Open AI GPT-4o model. However, when using OpenAI's GPT-4o, it works correctly.
The issue you're experiencing with the Azure OpenAI GPT-4o model, but not with OpenAI's GPT-4o, might be related to differences in how these models are configured or validated in Dify.
Ensure that the AIModelEntity for the Azure OpenAI model is correctly set up. The structure should include the model identifier, model_type, and other necessary properties. It's crucial that the credentials and model properties are correctly specified for Azure OpenAI models, as they might differ from those used for OpenAI models.
Additionally, check for any known issues or limitations with Azure OpenAI models in Dify. For instance, if the Azure model version is below 0613, it might not support certain features like function calls, which could lead to errors [1].
Review the configuration and validation steps for Azure OpenAI models to ensure compatibility with Dify's platform. If the issue persists, it might be helpful to compare the configurations of both models to identify any discrepancies.
> @dosu I have reinstalled all plugins and the error has changed. Now I get the following error:
> Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"ValidationError","message":"1 validation error for FunctionCallingParams\nmodel.entity\n Input should be a valid dictionary or instance of AIModelEntity [type=model_type, input_value=None, input_type=NoneType]\n For further information visit https://errors.pydantic.dev/2.8/v/model_type"}
I am now getting the exact same error when making an external API call to a workflow run. It was working perfectly just a few days ago. I am using the Web Pro version as well and have installed all the necessary plugins over again.
Same problem with model: volcengine_maas + deepseek-v3.
> @dosu I have reinstalled all plugins and the error has changed. Now I get the following error:
> Failed to transform agent message: PluginInvokeError: {"args":{},"error_type":"ValidationError","message":"1 validation error for FunctionCallingParams\nmodel.entity\n Input should be a valid dictionary or instance of AIModelEntity [type=model_type, input_value=None, input_type=NoneType]\n For further information visit https://errors.pydantic.dev/2.8/v/model_type"}
Getting the same error when trying to use Agent as a node.
Model settings: Azure OpenAI Service Model --> gpt-4o-mini-2024-07-18
After debugging, I determined that fixing the error requires modifying the dependency code within the plugin.
Modify the `__init__.py` file in the directory `volumes/plugin_daemon/cwd/langgenius/agent-0.0.9@f16916b704a20067317dbe9030f62aa28f0832021a3ef6d4ce699504074c5e13/.venv/lib64/python3.12/site-packages/dify_plugin/interfaces/agent`, and set the default value of `entity` in the `AgentModelConfig` class to `None`.
Before modification:

```python
class AgentModelConfig(LLMModelConfig):
    entity: AIModelEntity
```

After modification (note that `Optional` comes from `typing` and `Field` from `pydantic`, so add those imports if the file does not already have them):

```python
class AgentModelConfig(LLMModelConfig):
    entity: Optional[AIModelEntity] = Field(default=None)
```
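To illustrate why the patch suppresses the error, here is a small self-contained sketch (assuming pydantic v2, with `AIModelEntity` and `LLMModelConfig` as simplified stand-ins for the real `dify_plugin` classes): with `Optional[...]` and a `None` default, constructing the config without an entity no longer raises.

```python
# Sketch of the patched behavior; simplified stand-ins, not the real classes.
from typing import Optional

from pydantic import BaseModel, Field


class AIModelEntity(BaseModel):
    model: str
    model_type: str


class LLMModelConfig(BaseModel):
    model: str = "gpt-4o"  # placeholder field for the sketch


class AgentModelConfig(LLMModelConfig):
    # The patched field: a missing/None entity is now accepted.
    entity: Optional[AIModelEntity] = Field(default=None)


cfg = AgentModelConfig()  # no entity supplied: validates fine
print(cfg.entity)         # None
```

Note this only silences the validation failure; downstream code that assumes `entity` is populated may still need the provider to supply a real `AIModelEntity`.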
To restart the `docker-plugin_daemon-1` container after modifying the code, you can use the following command:

```shell
docker restart docker-plugin_daemon-1
```
Verification result:
Hi, I got the same issue. After upgrading to v1.1.2, the agent flow with gpt-4o is truly restored. But the error still exists if I use Ollama with qwen2.5 ...
The solution I found was to update the plugins one by one, and that fixed it.