Support Azure AI Model Inference API provider

Open wong2 opened this issue 1 year ago • 1 comments

Feature Description

Today I deployed the Llama 3.1 model using Azure AI Studio and got it ready to use through the AI SDK. I originally assumed its API was OpenAI-compatible, but it didn't work when I tried. From the documentation, I learned that it uses the Azure AI Model Inference API, so does that mean we need a separate provider?

wong2 avatar Jul 24 '24 05:07 wong2

@wong2 yes, this might require a new provider, something like `azure-ai-studio`

lgrammel avatar Jul 24 '24 07:07 lgrammel

This need has come up again because we want to use DeepSeek R1 on Azure.

wong2 avatar Jan 30 '25 04:01 wong2

Agreed, we would really benefit from a non-OpenAI Azure provider so we can use DeepSeek on Azure!

DakotaWray2 avatar Jan 31 '25 20:01 DakotaWray2

+1 for an inference API provider!

DrPye avatar Feb 07 '25 11:02 DrPye

yes please!

miguelvictor avatar Feb 10 '25 10:02 miguelvictor

I wrote the following code and it works for me. It points the `@ai-sdk/openai-compatible` provider at the Azure AI Model Inference endpoint:

import { createOpenAICompatible } from '@ai-sdk/openai-compatible';

const API_VERSION = process.env.AZURE_API_VERSION!;
const AZURE_API_KEY = process.env.AZURE_API_KEY!;
const AZURE_RESOURCE_NAME = process.env.AZURE_RESOURCE_NAME!;

const azureInf = createOpenAICompatible({
  name: 'azure-inf',
  // The Model Inference endpoint lives under *.services.ai.azure.com/models,
  // not the OpenAI-style *.openai.azure.com path.
  baseURL: `https://${AZURE_RESOURCE_NAME}.services.ai.azure.com/models`,
  queryParams: {
    'api-version': API_VERSION,
  },
  // Inject the Azure api-key header on every request. Build a Headers object
  // so this works whether request.headers is a plain object or a Headers
  // instance (spreading a Headers instance would silently drop its entries).
  fetch: async (url, request) => {
    const headers = new Headers(request?.headers);
    headers.set('api-key', AZURE_API_KEY);
    return fetch(url, { ...request, headers });
  },
});
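As an aside, the base-URL and api-version pattern above can be captured in a small standalone helper. The function names here are illustrative only, not part of the AI SDK or the Azure SDK:

```typescript
// Illustrative helpers for the Azure AI Model Inference URL scheme.
// The endpoint is https://<resource>.services.ai.azure.com/models, with the
// API version passed as an api-version query parameter.

function azureInferenceBaseUrl(resourceName: string): string {
  return `https://${resourceName}.services.ai.azure.com/models`;
}

function withApiVersion(baseUrl: string, apiVersion: string): string {
  const url = new URL(baseUrl);
  url.searchParams.set('api-version', apiVersion);
  return url.toString();
}
```

This is just the URL construction factored out; `createOpenAICompatible` with `queryParams` (as above) achieves the same thing inside the provider.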

.......

// Then you can use it like this (wrapLanguageModel and
// extractReasoningMiddleware are exported from the 'ai' package):
'chat-model-reasoning': wrapLanguageModel({
  model: azureInf('DeepSeek-R1'),
  middleware: extractReasoningMiddleware({ tagName: 'think' }),
}),
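For context, `extractReasoningMiddleware({ tagName: 'think' })` separates the model's `<think>…</think>` block from the rest of the response, which is why it pairs well with DeepSeek-R1. A rough standalone sketch of that splitting, as an illustration only and not the SDK's actual implementation:

```typescript
// Illustrative sketch: split <tagName>…</tagName> reasoning out of a
// completed response string. The real middleware also handles streaming.
function splitReasoning(
  text: string,
  tagName: string,
): { reasoning: string; text: string } {
  const re = new RegExp(`<${tagName}>([\\s\\S]*?)</${tagName}>`, 'g');
  let reasoning = '';
  const rest = text.replace(re, (_match, inner: string) => {
    reasoning += inner;
    return '';
  });
  return { reasoning, text: rest.trim() };
}
```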

alexcoimbra12 avatar Feb 17 '25 05:02 alexcoimbra12