dkimmunichre
@excubo-jg hey, thanks for the response. I believe the 'model' parameter does that for me, mostly because deployment_id is only used if model doesn't exist in the inputs or...
Regardless I'll give it a shot and let you know!
@okhat Hi Omar. This issue is making me consider not adopting DSPy for some of my experiments... not sure if you believe I should open an issue on the Azure...
@excubo-jg could you share the version of openai you're using, as well as info on which parameters you're passing? Also, which model type are you using?
@excubo-jg I also looked into the code, and deployment_id is removed if the "model" parameter is provided, which it is in my case. I'm curious as to how you're getting it...
@excubo-jg I've answered that point: the "model" parameter works as a viable replacement for deployment_id (in fact, deployment_id gets deleted from kwargs in the dspy.AzureOpenAI module when 'model' exists). What...
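For concreteness, here's roughly the shape of my setup (a minimal sketch; the endpoint, key, API version, and deployment name below are placeholders, and exact parameter names may differ slightly across dspy versions):

```python
import dspy

# Sketch: configure dspy's Azure wrapper with "model" set to the Azure
# deployment name. Per the behaviour described above, also passing
# deployment_id would be redundant, since it gets dropped from kwargs
# once "model" is present.
azure_lm = dspy.AzureOpenAI(
    api_base="https://<your-resource>.openai.azure.com/",  # placeholder endpoint
    api_version="2024-02-15-preview",                       # placeholder API version
    api_key="<api-key>",                                    # placeholder key
    model="<deployment-name>",                               # Azure deployment name
)
dspy.settings.configure(lm=azure_lm)
```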
@franperic hey! Thank you for responding. What were your issues? I'm having no issues pinging... it just errors out for some reason with a 500. What method are you using...
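For what it's worth, this is the kind of bare-bones check I mean by "pinging" (a sketch with the plain openai>=1.x client, no dspy involved; endpoint, key, deployment name, and API version are placeholders):

```python
# Sketch: hit the Azure deployment directly to check whether the 500
# comes from Azure itself rather than from dspy.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://<your-resource>.openai.azure.com/",
    api_key="<api-key>",
    api_version="2024-02-15-preview",
)

response = client.chat.completions.create(
    model="<deployment-name>",  # Azure expects the deployment name here
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```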
> > @excubo-jg I've answered that point: the "model" parameter works as a viable replacement for deployment_id (in fact, deployment_id gets deleted from kwargs in the dspy.AzureOpenAI module when 'model'...
If "model" or "deployment_id" were handled differently, my error wouldn't be a 500, but a 400 or the request wouldn't even return anything because the URL would be wrong.
> In my set-up a value for model is sent, and it is different from the deployment_id. Openai is 1.16.2, i.e. not legacy. Per your analysis this set-up cannot work...