"404 Resource Not Found" when using Azure OpenAI model endpoint
I run chat-ui with the chat-ui-db docker image. I would like to connect it to my Azure OpenAI API endpoint.
I have set up the `.env.local` file as described in your docs and bind-mounted it into the Docker container:
MODELS=`[{
  "id": "gpt-4-1106-preview",
  "name": "gpt-4-1106-preview",
  "displayName": "gpt-4-1106-preview",
  "parameters": {
    "temperature": 0.5,
    "max_new_tokens": 4096
  },
  "endpoints": [
    {
      "type": "openai",
      "baseURL": "https://{resource-name}.openai.azure.com/openai/deployments/{deployment-id}/chat/completions",
      "defaultHeaders": {
        "api-key": "{api-key}"
      },
      "defaultQuery": {
        "api-version": "{api-version}"
      }
    }
  ]
}]`
When I send a message in chat-ui, a "404 Resource Not Found" error appears in the top right of the interface.
When I manually send an HTTP request to the Azure OpenAI API endpoint with the same parameters, I get a valid response.
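For reference, the manual request described above can be sketched with the Python standard library. The placeholders (`{resource-name}`, `{deployment-id}`, `{api-key}`, `{api-version}`) are the same ones used in the config and must be replaced with real values before actually sending anything; the sketch only builds the request so you can compare the exact URL and headers against what chat-ui sends.

```python
# Hypothetical reconstruction of the manual Azure OpenAI request that succeeds
# outside chat-ui. All placeholder values are assumptions from the config above.
import json
import urllib.request

resource_name = "{resource-name}"
deployment_id = "{deployment-id}"
api_key = "{api-key}"
api_version = "{api-version}"

# Azure routes chat completions by deployment and requires api-version as a query parameter.
url = (
    f"https://{resource_name}.openai.azure.com/openai/deployments/"
    f"{deployment_id}/chat/completions?api-version={api_version}"
)
body = json.dumps({
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 16,
}).encode()

req = urllib.request.Request(
    url,
    data=body,
    headers={"api-key": api_key, "Content-Type": "application/json"},
)
# With real credentials, this returns a chat completion:
# print(urllib.request.urlopen(req).read())
print(url)  # the exact path the working manual request hits
```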
How can I solve this?
@gqoew There's a known issue with this in the repo. I have fixed it on my local instance; please refer to https://github.com/huggingface/chat-ui/pull/1077
I'm a bit confused now.
If I understand correctly, @adhishthite you published a PR two months ago with a fix for this, and @nsarrazin modified your original fix and merged the PR into the main repo.
Despite this, chat-ui still does not work with the Azure OpenAI API using the current Docker build and the documented MODELS configuration.
@nsarrazin did you observe the same 404 issue?
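One plausible explanation for the 404 (an assumption on my part, not confirmed in this thread): if chat-ui's `openai` endpoint type drives an OpenAI-compatible client, such clients typically append `/chat/completions` to `baseURL` themselves, so a `baseURL` that already ends in `/chat/completions` produces a doubled path that Azure cannot route. A minimal sketch of that failure mode:

```python
# Sketch of how an OpenAI-style client typically builds its request path.
# Assumption: the client joins baseURL + "/chat/completions" on its own.
def request_path(base_url: str) -> str:
    return base_url.rstrip("/") + "/chat/completions"

# baseURL exactly as written in the MODELS config above
configured = ("https://{resource-name}.openai.azure.com/openai/deployments/"
              "{deployment-id}/chat/completions")
print(request_path(configured))
# path ends in /chat/completions/chat/completions -> Azure has no such route (404)

# Dropping the trailing /chat/completions from baseURL yields the path
# that the working manual request used.
trimmed = configured.removesuffix("/chat/completions")
print(request_path(trimmed))
```

If this is indeed the cause, the fix on the user side would be to leave `/chat/completions` out of `baseURL` and let the client add it.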