Change endpoint to Azure
Since the OpenAI API is available in Azure, is there a possibility to change the endpoint to Azure? Or is there a plan to add this feature?
Hi @timmoh
Thank you for raising this issue.
I took a closer look into the Azure documentation, and it looks like it is not enough to just change the endpoint URL. It additionally requires an api-version query parameter.
See: https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#completions
The endpoint URL itself is not static either, as it contains your personal resource name and deployment ID.
Are you using Azure and can you confirm my findings?
Just jumping in here: I can confirm this. The URL structure is, for example:
{openai_api_base}/openai/deployments/{deployment_name}/completions?api-version={openai_api_version}
Hi @duellsy
Thank you for the confirmation.
I think the upcoming change to the package would help here: https://github.com/openai-php/client/pull/75
The only thing we would have to add is a new method to the factory where arbitrary values can be passed to be used as query parameters:
->withQueryParams(['api-version' => '<version>'])
Neat 🙌
I couldn't see how that would end up with the correct full endpoint as above, though? Unless you meant that it's currently a big step in that direction, in which case my apologies.
Sorry, if it wasn't clear 😅 Here is the full example:
$client = OpenAI::factory()
->withBaseUrl('https://{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}')
->withHttpHeader('api-key', '{your-api-key}')
->withQueryParams(['api-version' => '{version}'])
->make();
Ha! Beat me to it, I was just about to say the entire URL up to /completions could be used as the base URL ✌️
This should now work using the newly introduced factory:
$client = OpenAI::factory()
->withApiKey('<your-api-key>')
->withBaseUri('{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}')
->withHttpHeader('api-key', '{your-api-key}')
->withQueryParam('api-version', '{version}')
->make();
@duellsy Could you please verify this for us?
Nice one. It works well. A few notes:
->withApiKey() is not needed, since ->withHttpHeader() handles it.
So the setup needed is just:
$client = OpenAI::factory()
->withBaseUri('{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}')
->withHttpHeader('api-key', '{your-api-key}')
->withQueryParam('api-version', '{version}')
->make();
With Azure, you need to deploy a model (identified by the {deployment-id}), so the model is already baked into the base URI. That means you don't need to supply the model when making calls; supplying it does nothing, so it's not a cause for concern.
So a minimal sample completion call is just:
$result = $client->completions()->create([
'prompt' => 'PHP is'
]);
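Putting the setup and the call together, here is a minimal end-to-end sketch. All curly-brace values are placeholders for your own Azure resource, and the response access at the end assumes the client's usual response objects:

```php
<?php

require __DIR__ . '/vendor/autoload.php';

// Build a client pointed at an Azure OpenAI deployment.
// Replace the {…} placeholders with your own Azure values.
$client = OpenAI::factory()
    ->withBaseUri('{your-resource-name}.openai.azure.com/openai/deployments/{deployment-id}')
    ->withHttpHeader('api-key', '{your-api-key}')
    ->withQueryParam('api-version', '{version}')
    ->make();

// The model is determined by the Azure deployment,
// so only the prompt is required here.
$result = $client->completions()->create([
    'prompt' => 'PHP is',
]);

echo $result->choices[0]->text;
```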
@duellsy I thought it would be nice to have a section in the Readme about this, so I have stolen some parts of your answer.
I'd be glad if you could give it a look, as I am not into Azure at all ;)
https://github.com/openai-php/client/pull/109/files
LGTM 👍