Azure AI requests are failing due to a changed endpoint
What Happened?
Provider - azure-ai
Example model - mistral-large
According to the new Azure AI Studio, the AI inference model URLs have changed to https://resource-name.openai.azure.com/, but the code still refers to the previous patterns.
Using the provider azure-openai gives this error:
{"error":{"message":"azure-openai error: Backend returned unexpected response. Please contact Microsoft for help.","type":null,"param":null,"code":"InternalServerError"},"provider":"azure-openai"}
The Portkey log shows a bad URL:
url": "https://undefined.undefined.inference.ml.azure.com/score/chat/completions?api-version=2024-05-01-preview",
"method": "POST",
What Should Have Happened?
Add another mode or change the URL pattern to:
if (azureDeploymentType === 'serverless') {
  return `https://${azureDeploymentName?.toLowerCase()}.openai.azure.com`;
}
Relevant Code Snippet
https://github.com/Portkey-AI/gateway/edit/main/src/providers/azure-ai-inference/api.ts
if (provider === GITHUB) {
  return 'https://models.inference.ai.azure.com';
}
if (azureDeploymentType === 'serverless') {
  return `https://${azureDeploymentName?.toLowerCase()}.${azureRegion}.models.ai.azure.com`;
}
Your Twitter/LinkedIn
https://www.linkedin.com/in/amanintech/
BTW, I'm happy to patch this if it seems like a good fix.
@amanintech thanks for reporting this. Yes, the fix looks good, you can raise it. I'm assigning the issue to you.
closing as stale