[Bug]: Unable to use Azure OpenAI
Current Behavior
I've configured Wave AI according to the docs. I've tried GPT-4o, GPT-4.1, and GPT-4o-mini; they either give a "resource not found" error or this one: error calling openai API: error, status code: 400, status: 400 Bad Request, message: Invalid value: 'error'. Supported values are: 'system', 'assistant', 'user', 'function', 'tool', and 'developer'.
I have other apps using this endpoint, so I know it's working.
Expected Behavior
For the configured Azure OpenAI preset to return a response.
Steps To Reproduce
In the Wave AI block, choose "Add AI preset", enter the config, and click Save. Run a test and receive an error.
{
  "ai@azure-gpt4": {
    "display:name": "Azure GPT-4",
    "display:order": 1,
    "ai:*": true,
    "ai:apitype": "azure",
    "ai:baseurl": "https://xxx.openai.azure.com",
    "ai:model": "gpt-4.1",
    "ai:apitoken": "xxxx"
  }
}
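A note on the "resource not found" variant: with "ai:apitype": "azure", clients typically extend the base URL with a deployments path, and Azure resolves the segment after /openai/deployments/ against the deployment name from the Azure portal, not the underlying model ID. A minimal sketch of the URL shape Azure expects (the api-version value here is an assumption):

```python
# Sketch: how an OpenAI-compatible client typically builds the Azure
# chat-completions URL from the preset above. If "gpt-4.1" is not the
# exact deployment name in the Azure portal, Azure returns a
# "resource not found" error. The api-version value is an assumption.
base_url = "https://xxx.openai.azure.com"
deployment = "gpt-4.1"
api_version = "2024-06-01"

url = (f"{base_url}/openai/deployments/{deployment}"
       f"/chat/completions?api-version={api_version}")
print(url)
```

If the deployment in the portal has a different name (e.g. without the dot), that alone produces the 404 even when the key and base URL are correct.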
Wave Version
0.11.3 (202505051805)
Platform
Windows
OS Version/Distribution
11
Architecture
x64
Anything else?
No response
Questionnaire
- [ ] I'm interested in fixing this myself but don't know where to start
- [ ] I would like to fix and I have a solution
- [ ] I don't have time to fix this right now, but maybe later
I have LiteLLM set up to proxy to Azure for apps that don't support Azure directly but do support the standard OpenAI API. I see the same error in the client and in the server log:
16:52:33 - LiteLLM Proxy:DEBUG: common_request_processing.py:496 - An error occurred: litellm.BadRequestError: AzureException BadRequestError - Invalid value: 'error'. Supported values are: 'system', 'assistant', 'user', 'function', 'tool', and 'developer'
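For what it's worth, both logs point at the same thing: Wave appears to include a chat message whose role is "error", and the OpenAI-compatible endpoint rejects any role outside the allowed set. A minimal sketch of that validation (the helper name is hypothetical):

```python
# Roles the OpenAI-compatible chat API accepts, taken from the error
# message above. Any other role (e.g. "error") is rejected with a 400.
ALLOWED_ROLES = {"system", "assistant", "user", "function", "tool", "developer"}

def first_invalid_role(messages):
    """Return the first role not in ALLOWED_ROLES, or None (hypothetical helper)."""
    for msg in messages:
        if msg.get("role") not in ALLOWED_ROLES:
            return msg.get("role")
    return None

# A payload containing a role of "error", as the logs suggest Wave sends:
messages = [
    {"role": "user", "content": "hi"},
    {"role": "error", "content": "..."},
]
print(first_invalid_role(messages))  # -> error
```

So the bug looks like it is on the client side (how Wave labels messages in the request), not in the Azure or LiteLLM configuration.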