Reference Models Centrally, Set OpenAI Model to gpt-5-nano, and Update Spec Defaults
Feat: Add new conversation components and update model defaults
This pull request introduces documentation for two new conversation components (GoogleAI and Ollama) and updates the default LLM models for several existing components to newer versions.
A key change is that the default model for each conversation component can now be configured at runtime via a dedicated environment variable.
Summary of Changes:

- New Conversation Components:
  - Added documentation for the `conversation.googleai` component, which defaults to the `gemini-2.5-flash-lite` model.
  - Added documentation for the `conversation.ollama` component, which defaults to the `llama3.2:latest` model.
- Default Model Updates:
  - Anthropic: The default model has been updated from `claude-3-5-sonnet-20240620` to `claude-sonnet-4-20250514`.
  - Hugging Face: The default model has been updated from `meta-llama/Meta-Llama-3-8B` to `deepseek-ai/DeepSeek-R1-Distill-Qwen-32B`.
  - OpenAI: The default model has been updated from `gpt-4-turbo` to `gpt-5-nano`.
  - Mistral: The default model remains `open-mistral-7b`.
- Configuration via Environment Variables:
  - Each conversation component's default model can now be overridden using a specific environment variable (e.g., `OPENAI_MODEL`, `GOOGLEAI_MODEL`, etc.).
  - The central environment variable reference page has been updated to document all new model environment variables: `ANTHROPIC_MODEL`, `GOOGLEAI_MODEL`, `HUGGINGFACE_MODEL`, `MISTRAL_MODEL`, `OLLAMA_MODEL`, `OPENAI_MODEL`.
@giterinhub - Is there a corresponding code issue tracking this that you are aware of?
Created https://github.com/dapr/components-contrib/pull/3792
> @giterinhub - Is there a corresponding code issue tracking this that you are aware of?
No issue that I'm aware of; I just created the PR. Thanks for the good time at the Boom Battle, Mark!
@giterinhub - Can you review this PR now that the code is in? In particular, can you address the points below. @sicoyle @bibryam - Please review.
- How is the AZURE_OPENAI_MODEL env used and by which component? How does this relate to the apiType field here https://v1-16.docs.dapr.io/reference/components-reference/supported-conversation/openai/ used to set Azure usage?
- Why have you included the OLLAMA model? There is an existing MD file, so I presume this is not needed, or you should update the existing one.
- Can you update the https://v1-16.docs.dapr.io/reference/environment/ list with these 7 env variables, including a description of each one?
- Should there be an env var for AWS Bedrock, and if not, why?
Thanks @sicoyle !
@giterinhub any chance you can correct the DCO step on the PR? If you click in on the build step it has the cmds you can run to make it clean and green :)
Need to wait on this PR from main to branch https://github.com/dapr/components-contrib/pull/4029
@msfussell
- How is the AZURE_OPENAI_MODEL env used and by which component? How does this relate to the apiType field here https://v1-16.docs.dapr.io/reference/components-reference/supported-conversation/openai/ used to set Azure usage?
- When apiType = "azure": uses the AZURE_OPENAI_MODEL environment variable (defaults to "gpt-4.1-nano").
- When apiType is not "azure": uses the OPENAI_MODEL environment variable (defaults to "gpt-5-nano").
- Why have you included the OLLAMA model, there is an existing MD file, I presume this is not needed or update the existing one?
I don't see an additional MD file; maybe it was removed.
- Can you update the https://v1-16.docs.dapr.io/reference/environment/ list with the list of these 7 env variables with a description of each one.

Seems that is done.
- Should there be an env var for AWS Bedrock and if not why?
IMO Bedrock should follow the same pattern and support env variable for model override.
Apart from these, this is a great PR. (We need a similar PR for API_KEY separately.)
I mentioned already that I'd like to have AZURE_OPENAI_MODEL and OPENAI_MODEL separately. @bibryam I had to remove the entry for AZURE_OPENAI_MODEL as Sam wanted it gone ;)
AZURE_OPENAI_MODEL is not a thing... all components that use the conversation.openai type just use the OPENAI_MODEL env var if they choose to set their model via an env var. There are several OpenAI-compatible components, not just Azure, so we don't want a ton of env vars for each of them. Since they can all use the openai component, they can all use the openai env var. Examples include: Minimax, Qwen, Ernie, Azure, etc.
Editing to add a few clarifications:
There is logic under the hood that checks if apiType is set to azure. In that case, the appropriate Azure model is chosen instead of defaulting to the generic OpenAI model.
@bibryam Please create a separate issue if you’d like the API_KEY to also be set via an env var. That said, should all metadata fields be configurable this way? It feels like a lot—so where do we draw the line?
Looking back at the contrib PR, it appears we missed Bedrock. This shouldn’t block the current PR and can be added later, but for now, it will be missing.