gpt-migrate
Add Azure support (feature request)
Please add Azure OpenAI support, which requires pointing at a different OpenAI endpoint.
Just add a way to change the OpenAI base URL from the CLI so we can use the Azure endpoint or any proxy endpoint.
+1
I believe this should now be covered by the LiteLLM integration @amitrahav @Ran-Mewo @vicdotdevelop
https://github.com/joshpxyne/gpt-migrate/blob/8f6dfdd45096716aa078fcd659e50aa04b44fec7/gpt_migrate/ai.py#L23
https://docs.litellm.ai/docs/providers/azure
Happy to make a PR if there's anything you feel is missing
cc: @joshpxyne
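For reference, here's a minimal sketch of the Azure path through LiteLLM that the docs above describe. The deployment name, endpoint, and API version below are placeholders you'd replace with your own Azure values; nothing in gpt-migrate sets these for you.

```python
import os
import litellm

# Placeholder values: substitute your own Azure resource, deployment, and key.
os.environ["AZURE_API_KEY"] = "<your-azure-key>"
os.environ["AZURE_API_BASE"] = "https://<your-resource>.openai.azure.com/"
os.environ["AZURE_API_VERSION"] = "2023-07-01-preview"

# LiteLLM routes "azure/<deployment-name>" through the Azure OpenAI provider,
# so the part after "azure/" must match a deployment in your Azure resource.
response = litellm.completion(
    model="azure/gpt-4",
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```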
@krrishdholakia so if I'm using Azure, do I still have to set the env var OPENAI_API_KEY? Or would simply setting AZURE_API_KEY, AZURE_API_BASE, and AZURE_API_VERSION work?
Without OPENAI_API_KEY, the command still expects that env var to be set. When I set it to a dummy value, I get:
litellm.exceptions.APIConnectionError: Error code: 401 - {'error': {'message': 'Missing Authentication header', 'code': 401}}
I don't think Azure is supported with the current codebase sadly.
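If someone wants to try a patch, here's a rough sketch of the kind of change that might be needed around the litellm.completion call in gpt_migrate/ai.py. The wrapper function, its name, and the AZURE_API_BASE check are my own assumptions for illustration, not existing gpt-migrate code.

```python
import os
from litellm import completion

def create_completion(model, messages, temperature=0.0):
    # Hypothetical patch, not current gpt-migrate code: when the Azure env
    # vars are present, prefix the model with "azure/" so LiteLLM treats it
    # as an Azure deployment name instead of an OpenAI model, and no
    # OPENAI_API_KEY is needed.
    if os.getenv("AZURE_API_BASE") and not model.startswith("azure/"):
        model = f"azure/{model}"
    return completion(model=model, messages=messages, temperature=temperature)
```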