Model providers for LLM models
Self Checks
- [x] I have read the Contributing Guide and Language Policy.
- [x] I have searched for existing issues, including closed ones.
- [x] I confirm that I am using English to submit this report, otherwise it will be closed.
- [x] Please do not modify this template :) and fill in all the required fields.
1. Is this request related to a challenge you're experiencing? Tell me about your story.
During the last week, OpenAI had a service degradation. To resolve it, I switched to AzureAI.
It would have been faster if the LLM model system were set up as:
LLM model -> list of providers (one chosen as the default). In the LLM nodes, you would be able to choose a model and optionally override the chosen provider (the default is used otherwise).
This would have allowed me to switch the default provider for the models I use from OpenAI to AzureAI. Instead, I had to go through every node of every workflow and manually select the same identical model from a different provider.
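To illustrate the idea, here is a minimal sketch of the proposed resolution logic. All names (`ModelEntry`, `LLMNode`, `resolve_provider`) are hypothetical and do not reflect Dify's actual internals; the point is only that a node falls back to a registry-level default provider unless it explicitly overrides it.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelEntry:
    """A model registered with one or more providers (hypothetical structure)."""
    name: str
    providers: list          # provider identifiers offering this model
    default_provider: str    # globally chosen default for this model


@dataclass
class LLMNode:
    """A workflow node referencing a model, with an optional provider override."""
    model: str
    provider_override: Optional[str] = None


def resolve_provider(node: LLMNode, registry: dict) -> str:
    """Return the node's override if set, otherwise the model's default provider."""
    entry = registry[node.model]
    if node.provider_override is not None:
        if node.provider_override not in entry.providers:
            raise ValueError(f"{node.provider_override} does not offer {node.model}")
        return node.provider_override
    return entry.default_provider


# Switching the default once updates every node that does not override it.
registry = {"gpt-4o": ModelEntry("gpt-4o", ["openai", "azure_openai"], "openai")}
node = LLMNode(model="gpt-4o")
print(resolve_provider(node, registry))          # -> openai
registry["gpt-4o"].default_provider = "azure_openai"
print(resolve_provider(node, registry))          # -> azure_openai
```

With this scheme, the OpenAI-to-AzureAI migration described above would have been a single change to the registry entry rather than an edit in every workflow node.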
2. Additional context or comments
The system I propose scales much better than manually editing every node in every workflow.
3. Can you help us with this feature?
- [ ] I am interested in contributing to this feature.
I also created a similar issue a few months ago and the team said they need some time. 🤔
https://github.com/langgenius/dify/issues/21189
Hi, @DavideDelbianco. I'm Dosu, and I'm helping the Dify team manage their backlog and am marking this issue as stale.
Issue Summary:
- You proposed a model provider system for LLM models to enable setting a default provider with optional per-node overrides.
- This feature aims to simplify provider switching and improve workflow management.
- Another user mentioned a similar request was raised months ago, with the team noting it would take some time to address.
- There has been ongoing interest but no concrete progress or resolution yet.
Next Steps:
- Please let me know if this issue is still relevant to the latest version of Dify by commenting here.
- If I don’t hear back within 15 days, this issue will be automatically closed.
Thanks for your understanding and contribution!