[Bug] Unable to run DSPy on AzureOpenAI
What happened?
Backing off 0.4 seconds after 1 tries calling function <function AzureOpenAI.request at 0x7fb1079549d0> with kwargs {}
Backing off 1.9 seconds after 2 tries calling function <function AzureOpenAI.request at 0x7fb1079549d0> with kwargs {}
Steps to reproduce
Configure DSPy
dspy.settings.configure(
    lm=dspy.AzureOpenAI(
        api_base=AZURE_ENDPOINT,
        api_version=API_VERSION,
        api_key=API_KEY,
        model=DEPLOYMENT,
    )
)
DSPy version
3.0.3
@ajayarunachalam Thanks for reporting the issue! This doesn't look like a DSPy error, though. Could you try directly calling litellm.completion with your setup?
Okay, sure. I will do that. Can you point me to a reference example?
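For reference, a minimal direct litellm call against an Azure deployment looks roughly like this; the endpoint, key, API version, and deployment name below are placeholders for your own values:

```python
import litellm

# Placeholders: replace with your own Azure OpenAI resource settings.
AZURE_ENDPOINT = "https://<your-resource>.openai.azure.com/"
API_KEY = "<your-api-key>"
API_VERSION = "<your-api-version>"
DEPLOYMENT = "<your-deployment-name>"

response = litellm.completion(
    model=f"azure/{DEPLOYMENT}",
    api_base=AZURE_ENDPOINT,
    api_key=API_KEY,
    api_version=API_VERSION,
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```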
@chenmoneygithub I tried with litellm and ran into this error: BadRequestError: litellm.BadRequestError: AzureException BadRequestError - Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead. Just for context: I am using GPT-5, and I noticed there is an outstanding issue - https://github.com/BerriAI/litellm/issues/13432
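For what it's worth, until that litellm issue is resolved, a possible workaround at the litellm level (sketched from the error message above, untested against GPT-5 here) is to let litellm drop unsupported parameters or to pass max_completion_tokens explicitly:

```python
import litellm

# Option 1: ask litellm to silently drop parameters the deployment rejects
# (e.g. max_tokens on deployments that only accept max_completion_tokens).
litellm.drop_params = True

# Option 2: send the parameter name the error message asks for instead of max_tokens.
response = litellm.completion(
    model=f"azure/{DEPLOYMENT}",   # same placeholders as in the snippet above
    api_base=AZURE_ENDPOINT,
    api_key=API_KEY,
    api_version=API_VERSION,
    messages=[{"role": "user", "content": "Say hello."}],
    max_completion_tokens=256,
)
```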
Can you share the full reproduction code and the litellm version?
@ajayarunachalam Are you using a deployment endpoint or a response endpoint? This format only applies to deployment endpoints in Azure.
With a deployment endpoint you can do it like this:
dspy.configure(
lm=dspy.LM(
"azure/gpt-4o-mini",
api_base="https://xxx.openai.azure.com/",
api_key="foobar"
)
)
This is how I set up Azure OpenAI in DSPy; behind the scenes it uses litellm. You can configure it like this:
```python
import os, dspy

lm = dspy.LM(
    model=f"azure/{deployment}",
    custom_llm_provider="azure",
    api_key=azure_key,
    api_base=azure_base,
    api_version=azure_version,
)
dspy.configure(lm=lm)

print("Configured DSPy with Azure OpenAI deployment =", deployment)
print("Using Azure API version =", azure_version)
```
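After configuring, a quick sanity check is to call the LM directly; in recent DSPy versions a dspy.LM instance is callable and returns a list of completions (the prompt below is arbitrary):

```python
# Arbitrary prompt, just to confirm the Azure deployment responds.
outputs = lm("Reply with the single word: pong")
print(outputs)  # list of completion strings
```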