[Feature]: Add support for llama2 on azure ai, use azure_ai/ provider
The Feature
https://github.com/BerriAI/litellm/blob/e044d6332f9d18a27f7c1bd794d7ef228428fa2c/docs/my-website/docs/providers/azure_ai.md?plain=1#L4
Azure AI supports multiple models beyond Mistral. The current model_name pattern appears to be the same as the Mistral provider's (https://litellm.vercel.app/docs/providers/mistral), and it won't work if we switch to, for example, a llama2 model on Azure AI. Shouldn't it be azure_ai/{deployment_name}?
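For illustration, a minimal sketch of the call this request envisions. Only the azure_ai/ prefix idea and the litellm.completion API come from the thread; the deployment name, endpoint, and key below are placeholders:

```python
import litellm

# Proposed pattern: route any Azure AI deployment through an azure_ai/ prefix.
# "my-llama2-deployment" and the api_base URL are hypothetical placeholders.
response = litellm.completion(
    model="azure_ai/my-llama2-deployment",
    messages=[{"role": "user", "content": "Hello, how are you?"}],
    api_base="https://my-endpoint.eastus2.inference.ai.azure.com",
    api_key="my-azure-ai-api-key",
)
print(response.choices[0].message.content)
```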
Motivation, pitch
Consistent naming conventions for model_name
Twitter / LinkedIn details
@ttdevelop
Acknowledging this. When we added Azure AI, users just wanted to use Mistral on Azure AI. Marking this issue as: add support for llama2 on Azure AI. As part of that we'd need to move to azure_ai/mistral, azure_ai/llama2, etc.
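Under that unified prefix, switching models would reduce to changing the deployment segment of the model string. A sketch under that assumption; the deployment names and endpoints here are illustrative, not real values:

```python
import litellm

# With a unified azure_ai/ prefix, swapping models is a one-string change.
# Deployment names and endpoints below are illustrative placeholders.
for deployment, endpoint in [
    ("mistral-large", "https://mistral-ep.eastus2.inference.ai.azure.com"),
    ("llama2-70b-chat", "https://llama2-ep.eastus2.inference.ai.azure.com"),
]:
    response = litellm.completion(
        model=f"azure_ai/{deployment}",
        messages=[{"role": "user", "content": "Say hi."}],
        api_base=endpoint,
        api_key="my-azure-ai-api-key",
    )
    print(deployment, "->", response.choices[0].message.content)
```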
@ishaan-jaff thanks for the quick response. I also notice that despite Mistral being a chat model, the payload from the current completion() call only returns the final assistant response instead of the whole chain of messages. Is that expected behavior?
I see the exact same response from the Azure endpoint when I query it with a curl command. Do you see any diff?
Oh yeah, I see that as well; never mind. I think I just misremembered something about the expected response payload. Thanks for checking on it.
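For reference, this is standard for chat-completions style APIs, which litellm's response mirrors: the response carries only the newly generated assistant turn, and the caller keeps the conversation history and appends each reply. A minimal sketch; the deployment name and endpoint are placeholders:

```python
import litellm

# Placeholder deployment/endpoint; the point here is the response shape.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Name one planet."},
]
response = litellm.completion(
    model="azure_ai/my-mistral-deployment",
    messages=messages,
    api_base="https://my-endpoint.eastus2.inference.ai.azure.com",
    api_key="my-azure-ai-api-key",
)

# The response contains only the new assistant message, not the full chain;
# to continue the conversation, the caller appends it to the history.
assistant_turn = response.choices[0].message
messages.append({"role": "assistant", "content": assistant_turn.content})
```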
@ishaan-jaff any updates on this? There are now more pay-as-you-go model deployments for llama3 on Azure AI, but I'm not sure how to reach them through the current setup. The azure/ prefix seems to go down the Azure OpenAI Studio codepath.
@ti3x live here: https://docs.litellm.ai/docs/providers/azure_ai
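Per the linked docs, the azure_ai/ prefix now routes to the Azure AI codepath. A sketch of current usage under that assumption; the llama3 deployment name below is illustrative, and the endpoint and key are read from environment variables you would set yourself:

```python
import os
import litellm

# The azure_ai/ prefix routes to the Azure AI provider (see linked docs).
# The deployment name is illustrative; substitute your own endpoint's values.
response = litellm.completion(
    model="azure_ai/Meta-Llama-3-70B-Instruct",
    messages=[{"role": "user", "content": "Hello from llama3 on Azure AI"}],
    api_base=os.environ["AZURE_AI_API_BASE"],
    api_key=os.environ["AZURE_AI_API_KEY"],
)
print(response.choices[0].message.content)
```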