AzureOpenAI support
Thanks so much for the great package.
Using Azure endpoints instead of the OpenAI API requires specifying the API endpoint URL and supplying the API key slightly differently: Azure expects an api-key: YOURKEY header rather than OpenAI's Authorization: Bearer X header.
My question is: how would you prefer Azure be implemented? I see that it is already possible to pass in headers. Is that the preferred way moving forward? If not, please advise how you would want the integration to look and I can submit a pull request. Thank you!
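To make the difference concrete, here is a minimal sketch. The helper name is hypothetical (it is not part of the ai package); it only shows that the auth difference between the two providers comes down to the header shape:

```typescript
// Hypothetical helper (not part of the ai package) showing the only
// auth difference between the two providers: the header shape.
function buildAuthHeaders(
  provider: "openai" | "azure",
  apiKey: string
): Record<string, string> {
  return provider === "azure"
    ? { "api-key": apiKey }                  // Azure OpenAI style
    : { Authorization: `Bearer ${apiKey}` }; // OpenAI style
}
```

If the package's existing header pass-through is kept, supplying { "api-key": ... } there would be one possible route.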
I've realized using your Langchain interface it's very easy to connect to Azure OpenAI service as shown below. I guess the only remaining question is how to do this with the openai-edge package.
const llm = new ChatOpenAI({
streaming: true,
callbackManager: CallbackManager.fromHandlers(handlers),
azureOpenAIApiVersion: process.env.AZURE_OPENAI_API_VERSION,
azureOpenAIApiKey: process.env.AZURE_OPENAI_API_KEY,
azureOpenAIApiInstanceName: process.env.AZURE_OPENAI_INSTANCE_NAME,
azureOpenAIApiDeploymentName:
process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME,
})
I think you can use Langchain's ChatOpenAI instead of openai: import { ChatOpenAI } from "langchain/chat_models/openai"; docs: https://js.langchain.com/docs/modules/models/chat/
I think the main issue with the openai-edge package is that it does not provide a way to add query parameters. However, this is required to use the Azure OpenAI endpoints: to call an Azure OpenAI endpoint, a consumer must provide the api-version query parameter.
https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference
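As a sketch of the missing piece (the helper name and version string are illustrative, not part of openai-edge), this is all that needs to happen per request:

```typescript
// Illustrative helper (not part of openai-edge): attach the api-version
// query parameter that Azure OpenAI requires on every request.
function withApiVersion(url: string, apiVersion: string): string {
  const u = new URL(url);
  u.searchParams.set("api-version", apiVersion);
  return u.toString();
}
```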
I have it working like in your example 🤔 The query parameters are added correctly. Can you tell me what the issue is? Maybe I can help 😅
I opened an issue over at the openai-edge repo.
https://github.com/dan-kwiat/openai-edge/issues/8
Any updates on this? I'd love to use my Azure key with the SDK.
So this is a bit weird, but what worked for me was swapping these two values when creating a new ChatOpenAI:
azureOpenAIApiInstanceName: MAKE_THIS_YOUR_DEPLOYMENT_NAME,
azureOpenAIApiDeploymentName: MAKE_THIS_YOUR_INSTANCE_NAME,
I noticed that it was actually calling {DeploymentName}.openai.azure.com.
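A hypothetical sanity check for the mapping (names here are illustrative): with the documented behavior, the instance name should end up in the hostname and the deployment name in the path, so if requests go to {DeploymentName}.openai.azure.com the two values are swapped:

```typescript
// Hypothetical sanity check: with the documented mapping, the instance name
// becomes the subdomain and the deployment name a path segment.
function azureTarget(instanceName: string, deploymentName: string) {
  return {
    host: `${instanceName}.openai.azure.com`,
    path: `/openai/deployments/${deploymentName}`,
  };
}
```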
@ovived you can now pass query params to the openai-edge config:
https://github.com/dan-kwiat/openai-edge/issues/8
Closing as this has been resolved upstream, see https://github.com/dan-kwiat/openai-edge/issues/8#issuecomment-1634334338
import { Configuration } from "openai-edge"

const endpoint = `https://${AZURE_OPENAI_API_INSTANCE_NAME}.openai.azure.com`

export const config = new Configuration({
  // apiKey: AZURE_OPENAI_API_KEY, // OPENAI_KEY,
  basePath: `${endpoint}/openai/deployments/${AZURE_OPENAI_API_DEPLOYMENT_NAME}`,
  // Azure requires the api-version query parameter on every request
  defaultQueryParams: new URLSearchParams({
    "api-version": AZURE_OPENAI_API_VERSION,
  }),
  baseOptions: {
    headers: {
      // Azure uses an api-key header instead of Authorization: Bearer
      "api-key": AZURE_OPENAI_API_KEY,
    },
  },
})
Where do I need to pass the configuration?
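In case it helps: with openai-edge you pass the configuration to the OpenAIApi constructor (new OpenAIApi(config)) and then call createChatCompletion on that instance. With the Azure setup above, the raw request it produces looks roughly like this sketch (all values are placeholders, and the actual fetch is commented out so the snippet stays self-contained):

```typescript
// Placeholder values standing in for the env vars used above.
const instance = "my-instance";
const deployment = "my-deployment";
const apiVersion = "2023-05-15";

// basePath + route + the required api-version query parameter
const url =
  `https://${instance}.openai.azure.com` +
  `/openai/deployments/${deployment}/chat/completions` +
  `?${new URLSearchParams({ "api-version": apiVersion })}`;

const init = {
  method: "POST",
  headers: {
    "api-key": "AZURE_OPENAI_API_KEY", // Azure-style auth header, not a Bearer token
    "content-type": "application/json",
  },
  body: JSON.stringify({
    messages: [{ role: "user", content: "Hello" }],
    stream: true,
  }),
};

// const res = await fetch(url, init);
```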