
Azure AI requests are failing due to changed endpoint

amanintech opened this issue 11 months ago · 2 comments

What Happened?

Provider: azure-ai
Example model: mistral-large

According to the new Azure AI Studio, the URLs for AI inference models have changed to https://resource-name.openai.azure.com/, but the code still refers to the previous patterns.

Using provider azure-openai gives this error:

{"error":{"message":"azure-openai error: Backend returned unexpected response. Please contact Microsoft for help.","type":null,"param":null,"code":"InternalServerError"},"provider":"azure-openai"}

The Portkey log shows a bad URL:

"url": "https://undefined.undefined.inference.ml.azure.com/score/chat/completions?api-version=2024-05-01-preview",
"method": "POST",
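The "undefined.undefined" host in the log comes from how JavaScript/TypeScript template literals handle missing values: interpolating undefined produces the literal string "undefined" instead of throwing. A minimal sketch (variable names taken from the snippet below; the surrounding code is illustrative, not the actual gateway source):

```typescript
// Sketch: why the logged URL contains "undefined.undefined".
// When optional config values are missing, template literals
// stringify undefined rather than failing fast.
const azureDeploymentName: string | undefined = undefined;
const azureRegion: string | undefined = undefined;

// `undefined?.toLowerCase()` short-circuits to undefined,
// which then interpolates as the string "undefined".
const url = `https://${azureDeploymentName?.toLowerCase()}.${azureRegion}.inference.ml.azure.com/score`;
console.log(url); // "https://undefined.undefined.inference.ml.azure.com/score"
```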

What Should Have Happened?

Add another mode, or change the URL pattern to:

if (azureDeploymentType === 'serverless') {
  return `https://${azureDeploymentName?.toLowerCase()}.openai.azure.com`;
}

Relevant Code Snippet

https://github.com/Portkey-AI/gateway/blob/main/src/providers/azure-ai-inference/api.ts

if (provider === GITHUB) {
  return 'https://models.inference.ai.azure.com';
}
if (azureDeploymentType === 'serverless') {
  return `https://${azureDeploymentName?.toLowerCase()}.${azureRegion}.models.ai.azure.com`;
}
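Putting the existing branches together with the proposed change, the base-URL selection might look like the sketch below. The standalone function shape and the GITHUB constant value are assumptions for illustration, not the actual gateway code:

```typescript
// Sketch of the proposed base-URL selection for azure-ai-inference.
// Names come from the snippets in this issue; the function signature
// and the GITHUB constant value are hypothetical.
const GITHUB = 'github';

function getBaseURL(
  provider: string,
  azureDeploymentType?: string,
  azureDeploymentName?: string
): string | undefined {
  if (provider === GITHUB) {
    return 'https://models.inference.ai.azure.com';
  }
  if (azureDeploymentType === 'serverless') {
    // Proposed: new Azure AI Studio serverless endpoints live under
    // <deployment>.openai.azure.com instead of the old
    // <deployment>.<region>.models.ai.azure.com pattern.
    return `https://${azureDeploymentName?.toLowerCase()}.openai.azure.com`;
  }
  return undefined;
}
```

With this shape, `getBaseURL('azure-ai', 'serverless', 'Mistral-Large')` would yield `https://mistral-large.openai.azure.com`, avoiding the region-scoped pattern entirely for serverless deployments.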

Your Twitter/LinkedIn

https://www.linkedin.com/in/amanintech/

amanintech · Dec 11 '24 05:12

BTW, I am happy to patch this if it seems like a good fix.

amanintech · Dec 11 '24 05:12

@amanintech thanks for reporting this. Yes, the fix looks good; feel free to raise a PR. I'm assigning the issue to you.

narengogi · Dec 19 '24 09:12

closing as stale

narengogi · Jul 09 '25 07:07