openai-node
support Microsoft Azure OpenAI service endpoints
Describe the feature or improvement you're requesting
Update the API configuration to support Azure OpenAI endpoints as well.
In order to use the Python OpenAI library with Microsoft Azure endpoints, we need to set the api_type, api_base and api_version in addition to the api_key. The api_type must be set to 'azure' and the others correspond to the properties of your endpoint. In addition, the deployment name must be passed as the engine parameter.
import openai

openai.api_type = "azure"
openai.api_key = "..."
openai.api_base = "https://example-endpoint.openai.azure.com"
openai.api_version = "2022-12-01"

# create a completion
completion = openai.Completion.create(engine="deployment-name", prompt="Hello world")

# print the completion
print(completion.choices[0].text)
Additional context
No response
+1
Azure OpenAI API
Here is something I came up with that works for Azure OpenAI. I'm going to use this until OpenAI adds support.
You'll have to deploy a model in Azure and make sure the deployment is named the same as the model you select, i.e. text-davinci-003 is deployed as text-davinci-003, which is used in the Azure request as you can see below.
Another thing to note: the completion object returned here has no data key, so instead of completion.data.choices[0].text you'll get completion.choices[0].text.
Feel free to add to the GPT3Params interface if it's missing keys you need.
Create a file and copy-paste this class:
export interface Configuration {
basePath: string;
apiKey: string;
}
interface GPT3Params {
model: string;
prompt: string;
temperature?: number;
max_tokens?: number;
presence_penalty?: number;
stop?: string[];
}
export class AzureOpenAIApi {
basePath: string;
apiKey: string;
constructor(config: Configuration) {
this.basePath = config.basePath;
this.apiKey = config.apiKey;
}
async createCompletion(data: GPT3Params) {
const url = `${this.basePath}/openai/deployments/${data.model}/completions?api-version=2022-12-01`;
const response = await fetch(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'api-key': this.apiKey,
},
body: JSON.stringify(data),
});
return await response.json();
}
}
To use the AzureOpenAIApi class, first import it:
import { AzureOpenAIApi, Configuration } from 'path/to/AzureOpenAIApi';
Next, create an instance of the class and pass in a Configuration object as an argument to the constructor. The basePath is your Azure endpoint, which you'll get from the Azure OpenAI web console:
const configuration: Configuration = {
apiKey: process.env.OPENAI_API_KEY as string,
basePath: process.env.OPENAI_BASE_PATH as string,
};
const openai = new AzureOpenAIApi(configuration);
Finally, you can call the createCompletion method on the AzureOpenAIApi instance, passing in a GPT3Params object as an argument:
const completion = await openai.createCompletion({
model: 'text-davinci-003',
prompt: 'What is the meaning of life?',
temperature: 0.6,
max_tokens: 500,
});
console.log(completion.choices[0].text);
Another take while they decide to add the api type:
import { Configuration, OpenAIApi } from "openai";
const configuration = new Configuration({
basePath: "https://[AZURE_OAI_BASEPATH]/openai/deployments/[MODEL_NAME]",
apiKey: "[API_KEY]",
});
const openai = new OpenAIApi(configuration);
export async function createCompletion(inputText) {
try {
const completion = await openai.createCompletion(
{
prompt: inputText,
},
{
headers: {
"api-key": configuration.apiKey,
},
params: {
"api-version": "2022-12-01",
},
}
);
return completion.data.choices[0].text;
} catch (err) {
console.log(err);
}
}
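For reference, calling that helper from an async context would look something like:
const text = await createCompletion('What is the meaning of life?');
console.log(text);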
+1
I would be happy if the Node.js library could be used as easily as Python's OpenAI module. Using axios is a hassle.
I published an npm package to support the Azure OpenAI API. Check the repo: https://github.com/1openwindow/azure-openai-node and here is the package: https://www.npmjs.com/package/azure-openai
To migrate from the official OpenAI library to the Azure OpenAI service, you can simply add the Azure info to the configuration; that's it. You do not need to change any of your calling code. Please see the steps below (a full sketch follows the steps):
- Install the library by running the following command:

npm install azure-openai

- Update the import statement from "openai" to "azure-openai":

// import { Configuration, OpenAIApi } from "openai";
import { Configuration, OpenAIApi } from "azure-openai";

- Add the Azure OpenAI information to your project configuration:

this.openAiApi = new OpenAIApi(
  new Configuration({
    apiKey: this.apiKey,
    // add azure info into configuration
    azure: {
      apiKey: {your-azure-openai-resource-key},
      endpoint: {your-azure-openai-resource-endpoint},
      deploymentName: {your-azure-openai-resource-deployment-name},
    },
  }),
);

- Run your code. That's it.
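Putting those steps together, a minimal sketch of the migration might look like the following. The azure option names mirror the snippet in step 3, and the call keeps the official v3 shape as described above; the environment variable and deployment names are placeholders, so check the package README for the exact API.

import { Configuration, OpenAIApi } from 'azure-openai';

const openai = new OpenAIApi(
  new Configuration({
    apiKey: process.env.OPENAI_API_KEY,
    // Azure info, mirroring step 3 above
    azure: {
      apiKey: process.env.AZURE_OPENAI_API_KEY,        // your Azure OpenAI resource key
      endpoint: 'https://example-endpoint.openai.azure.com',
      deploymentName: 'text-davinci-003',              // your deployment name
    },
  }),
);

// Completion calls keep the same shape as with the official library
const completion = await openai.createCompletion({
  model: 'text-davinci-003',
  prompt: 'Hello world',
});
console.log(completion.data.choices[0].text);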
Any update on this? +1
Even simpler than the example from jtvcodes: you can add the headers/params to the configuration as baseOptions:
const apiKey = loadYourApiKeySomehow();
const configuration = new Configuration({
apiKey,
basePath: 'https://[your-deployment-name].openai.azure.com/openai/deployments/[your-model-name]',
baseOptions: {
headers: {'api-key': apiKey},
params: {
'api-version': '2023-03-15-preview' // this might change. I got the current value from the sample code at https://oai.azure.com/portal/chat
}
}
});
The completion call is then just
const completion = await openai.createCompletion({prompt});
@zoellner's solution works well. Here is some public information on the api-version, which is required in the params:
https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions
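For chat completions specifically (which is what the 2023-03-15-preview api-version above targets), a rough sketch using the same baseOptions configuration could look like this, assuming basePath points at a chat model deployment such as gpt-35-turbo:

const chatCompletion = await openai.createChatCompletion({
  // Azure routes to the deployment named in basePath, so this field is effectively ignored
  model: 'gpt-35-turbo',
  messages: [{ role: 'user', content: 'Hello!' }],
});
console.log(chatCompletion.data.choices[0].message?.content);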
The upcoming v4 release of this library can be used with Azure as demonstrated in this example: https://github.com/openai/openai-node/blob/v4/examples/azure.ts
(it's quite similar to @zoellner's example; thank you for that!)
We do hope to make usage with Azure more convenient in the future.
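For reference, the linked v4 example boils down to roughly the following sketch. The resource, deployment, and environment variable names here are placeholders, and the api-version value may change:

import OpenAI from 'openai';

const resource = 'example-resource';      // your Azure OpenAI resource name
const deployment = 'example-deployment';  // your model deployment name
const apiKey = process.env.AZURE_OPENAI_API_KEY;

const openai = new OpenAI({
  apiKey,
  baseURL: `https://${resource}.openai.azure.com/openai/deployments/${deployment}`,
  defaultQuery: { 'api-version': '2023-03-15-preview' },
  defaultHeaders: { 'api-key': apiKey },
});

const chatCompletion = await openai.chat.completions.create({
  model: deployment,
  messages: [{ role: 'user', content: 'Say hello!' }],
});
console.log(chatCompletion.choices[0].message.content);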
There's also a separate Azure OpenAI client here: https://www.npmjs.com/package/@azure/openai
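If you go that route, note the client shape is different from this library. A minimal sketch based on that package's README (endpoint and deployment names are placeholders; double-check the current API, as the package is still in preview) looks roughly like:

import { OpenAIClient, AzureKeyCredential } from '@azure/openai';

const client = new OpenAIClient(
  'https://example-endpoint.openai.azure.com',
  new AzureKeyCredential(process.env.AZURE_OPENAI_API_KEY as string),
);

// The deployment name takes the place of the model parameter
const result = await client.getCompletions('example-deployment', ['Hello world']);
console.log(result.choices[0].text);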