How to connect to Azure hosted OpenAI?
I tried to create a global OpenAI bot with a config like this:

```yml
base_url: https://yourname.openai.azure.com/openai/deployments/gpt-4o
api_key: yourkey
text_generation:
  model_id: gpt-4o
  prompt: You are a brief, but helpful bot.
  temperature: 1.0
  max_response_tokens: 16384
  max_context_tokens: 128000
```

but this only returns `Resource not found` (code: 404).
When I copy the curl sample from Azure OpenAI Studio, it looks like this:

```sh
payload="{\"messages\":[{\"role\":\"system\",\"content\":[{\"type\":\"text\",\"text\":\"You are an AI assistant that helps people find information.\"}]}],\"temperature\":0.7,\"top_p\":0.95,\"max_tokens\":800}"
curl "https://yourname.openai.azure.com/openai/deployments/gpt-4o/chat/completions?api-version=2024-02-15-preview" \
  -H "Content-Type: application/json" \
  -H "api-key: yourkey" \
  -d "$payload"
```

This works just fine from within the container. How do I have to adapt these settings for Baibot?
When I try to set it up as an OpenAI-compatible provider, I receive this error:

```
Failed to get response from the OpenAI-compat chat completion API: ApiError("{\"error\":{\"code\":\"404\",\"message\":\"Resource not found\"}}")
```
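For what it's worth, I think the 404 comes from the URL shape: a generic OpenAI-compat client appends `/chat/completions` to the `base_url` but never adds the `api-version` query parameter that Azure requires, so Azure rejects the request with `Resource not found`. A small sketch of the difference (function names are illustrative, not Baibot's actual code):

```python
# Illustrative only: how an Azure OpenAI chat-completions URL is built,
# versus the path a generic OpenAI-compat client would request.

def azure_chat_url(resource: str, deployment: str, api_version: str) -> str:
    # Azure routes per deployment and requires an api-version query parameter.
    return (f"https://{resource}.openai.azure.com/openai/deployments/"
            f"{deployment}/chat/completions?api-version={api_version}")

def openai_compat_chat_url(base_url: str) -> str:
    # A generic OpenAI-compat client just appends the standard path.
    return base_url.rstrip("/") + "/chat/completions"

print(azure_chat_url("yourname", "gpt-4o", "2024-02-15-preview"))
print(openai_compat_chat_url(
    "https://yourname.openai.azure.com/openai/deployments/gpt-4o"))
```

With the `base_url` from my config above, the two URLs differ only in the trailing `?api-version=...`, which is exactly the part the compat client can't supply.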
I have the same problem with the current version, 1.3.2.
The problem is that the underlying API calls for Azure are a little different, so we need a separate implementation of https://github.com/etkecc/openai_api_rust, which I tried to create at https://github.com/kitzler-walli/az-openai-api

Now the bot itself would need to be adapted. The main problem is that Azure OpenAI needs a different endpoint (URL) for every kind of call: chat, image, completions, ... I currently can't decide whether I should create a bot for every kind, or implement a version where the endpoint for each kind of call can be configured.
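For the second approach, one option could be a config that lets the user override the endpoint per call type while the bot fills in the `api-version` itself. A rough sketch of what that might look like (purely hypothetical, not an existing Baibot option; deployment names are examples):

```yml
base_url: https://yourname.openai.azure.com
api_key: yourkey
api_version: 2024-02-15-preview
endpoints:
  chat: /openai/deployments/gpt-4o/chat/completions
  image: /openai/deployments/dall-e-3/images/generations
  completions: /openai/deployments/gpt-35-turbo-instruct/completions
```

That would keep a single bot per Azure resource and map each call type to its own deployment URL.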