[FEAT]: Allow Azure OpenAI defined as Workspace LLM Provider
How are you running AnythingLLM?
All versions
What happened?
Azure OpenAI can only be set up in the system default LLM Preference.
I want to specify it for a single workspace, but it is not available in the drop-down list.
Issue found in both Windows Desktop v1.5.0 and Docker on Linux.
Are there known steps to reproduce?
- Create a Workspace
- Go to Chat Settings - Workspace LLM Provider
- There is no option for 'Azure OpenAI'
AnythingLLM macOS v1.5.0 also doesn't have an 'Azure OpenAI' option in the LLM provider dropdown in Workspace settings.
This is actually intentional for Workspace LLMs at this time. https://github.com/Mintplex-Labs/anything-llm/blob/bf165f2ab25b873581edcacac934ebff3f429d8d/frontend/src/pages/WorkspaceSettings/ChatSettings/WorkspaceLLMSelection/index.jsx#L8
but what's the reasoning?
@spiveym It was because we use the process.env for keeping track of the model preference, and the model selector is, well, a dropdown - so we can't use `/models` to enumerate the available models.
We need a combobox component that supports free-form input plus the available models, so we can handle providers with many models at once - specifically those that don't support `/models`.
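To illustrate the combobox idea (this is a hypothetical sketch, not AnythingLLM's actual code - the function and its shape are invented for illustration): the core difference from a dropdown is that the typed value is always accepted as a valid selection, while known models, when the provider exposes them, are offered as filtered suggestions.

```javascript
// Hypothetical sketch of combobox state: free-form input is always a
// valid value; knownModels (possibly empty, e.g. for Azure OpenAI,
// which has no /models-style listing) only drives the suggestions.
function comboboxState(typed, knownModels) {
  const query = typed.trim().toLowerCase();
  const suggestions = query
    ? knownModels.filter((m) => m.toLowerCase().includes(query))
    : knownModels;
  return {
    value: typed, // accepted even when it matches no known model
    suggestions,
  };
}
```

With a provider that supports model enumeration, `comboboxState("gpt", ["gpt-4", "gpt-3.5-turbo", "llama-2"])` narrows the suggestions; with Azure OpenAI you'd pass an empty list and the user's deployment name is still accepted as the value.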