
AzureOpenAI support

Open swiecki opened this issue 2 years ago • 7 comments

Thanks so much for the great package.

Using Azure endpoints instead of the OpenAI API requires being able to specify the API endpoint URL and to supply your API key slightly differently: specifically, an api-key: YOURKEY header versus OpenAI's Authorization: Bearer X header.
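
The difference boils down to which header carries the key. A minimal sketch (the key values are placeholders, not real credentials):

```typescript
// Returns the auth header for each provider style (sketch only).
function authHeaders(provider: "openai" | "azure", key: string): Record<string, string> {
  return provider === "azure"
    ? { "api-key": key }                  // Azure OpenAI style
    : { Authorization: `Bearer ${key}` }; // OpenAI style
}
```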

My question is: how would you prefer Azure be implemented? I see that it is already possible to pass in headers. Is that the preferred way moving forward? If not, please advise how you would want the integration to look and I can submit a pull request. Thank you!

swiecki avatar Jun 17 '23 00:06 swiecki

I've realized it's very easy to connect to the Azure OpenAI service using your LangChain interface, as shown below. I guess the only remaining question is how to do this with the openai-edge package.

  const llm = new ChatOpenAI({
    streaming: true,
    callbackManager: CallbackManager.fromHandlers(handlers),
    azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
    azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
    azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_INSTANCE_NAME,
    azureOpenAIApiDeploymentName:
      process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
  })

swiecki avatar Jun 17 '23 00:06 swiecki

I think you can use ChatOpenAI from LangChain instead of openai: import { ChatOpenAI } from "langchain/chat_models/openai"; docs: https://js.langchain.com/docs/modules/models/chat/

CarlosZiegler avatar Jun 17 '23 20:06 CarlosZiegler

I think the main issue with the openai-edge package is that it does not provide a way to add query parameters. However, this is required to use the Azure OpenAI endpoints.

To call an Azure OpenAI endpoint, a consumer must provide the api-version query parameter.

https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference
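
Concretely, the required query string can be built like this (the version value is a placeholder taken from the docs, not a recommendation):

```typescript
// Azure OpenAI rejects requests without api-version; build it explicitly.
const apiVersion = "2023-05-15"; // placeholder version
const query = new URLSearchParams({ "api-version": apiVersion }).toString();
```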

benvp avatar Jun 19 '23 14:06 benvp

I have it working as in your example 🤔 the query parameters are added correctly. Can you tell me what the issue is? Maybe I can help 😅

luisFilipePT avatar Jun 21 '23 13:06 luisFilipePT

I opened an issue over at the openai-edge repo. https://github.com/dan-kwiat/openai-edge/issues/8

benvp avatar Jun 21 '23 13:06 benvp

Any updates on this? I'd love to use my Azure key with the SDK.

rdvo avatar Jun 21 '23 20:06 rdvo

So this is a bit weird, but what worked for me was swapping these two values when creating a new ChatOpenAI:

azureOpenAIApiInstanceName: MAKE_THIS_YOUR_DEPLOYMENT_NAME,
azureOpenAIApiDeploymentName: MAKE_THIS_YOUR_INSTANCE_NAME,

I noticed that it was actually calling {DeploymentName}.openai.azure.com.

4ortytwo avatar Jun 26 '23 14:06 4ortytwo

@ovived you can now pass query params to the openai-edge config: https://github.com/dan-kwiat/openai-edge/issues/8

dan-kwiat avatar Jul 07 '23 11:07 dan-kwiat

Closing as this has been resolved upstream, see https://github.com/dan-kwiat/openai-edge/issues/8#issuecomment-1634334338

import { Configuration } from "openai-edge"

const endpoint = `https://${AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com`
export const config = new Configuration({
    // apiKey: AZURE_OPENAI_API_KEY, // OPENAI_KEY — not used; Azure expects the api-key header below
    basePath: `${endpoint}/openai/deployments/${AZURE_OPENAI_API_DEPLOYMENT_NAME}`,
    defaultQueryParams: new URLSearchParams({
        "api-version": AZURE_OPENAI_API_VERSION
    }),
    baseOptions: {
        headers: {
            "api-key": AZURE_OPENAI_API_KEY,
        }
    }
})
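
For reference, a sketch of the request URL and headers this kind of configuration should produce (hypothetical placeholder values; the route shape follows the Azure REST docs, not openai-edge internals):

```typescript
// Placeholder values standing in for the env vars above.
const AZURE_OPENAI_API_INSTANCE_NAME = "my-instance";
const AZURE_OPENAI_API_DEPLOYMENT_NAME = "my-deployment";
const AZURE_OPENAI_API_VERSION = "2023-05-15";
const AZURE_OPENAI_API_KEY = "my-key";

const endpoint = `https://${AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com`;
const basePath = `${endpoint}/openai/deployments/${AZURE_OPENAI_API_DEPLOYMENT_NAME}`;

// Chat completions route plus the mandatory api-version query parameter.
const url = `${basePath}/chat/completions?` +
  new URLSearchParams({ "api-version": AZURE_OPENAI_API_VERSION }).toString();

// Azure uses an api-key header instead of Authorization: Bearer.
const headers: Record<string, string> = { "api-key": AZURE_OPENAI_API_KEY };
```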

MaxLeiter avatar Jul 17 '23 23:07 MaxLeiter

Where do I need to pass the configuration?