🚀 Feature: Support for Azure OpenAI API
🚀 Feature description
I think support for the Azure OpenAI API would be great, as many developers and organizations use Azure for their cloud needs. Adding Azure OpenAI API support would let them use this project within their own environments.
🤔 Pitch
I think support for the Azure OpenAI API would be great, as many developers and organizations use Azure for their cloud needs. Adding Azure OpenAI API support would let them use this project within their own environments.
🔍 Have you spent some time to check if this issue has been raised before?
- [X] I checked and didn't find similar issue
🏢 Have you read the Code of Conduct?
- [X] I have read the Code of Conduct
Hi @dholakashyap - I believe we can make this easier. I'm the maintainer of LiteLLM; we let you deploy an LLM proxy that calls 100+ LLMs in one format (Azure OpenAI, PaLM, Bedrock, OpenAI, Anthropic, etc.): https://github.com/BerriAI/litellm/tree/main/openai-proxy.
If this looks useful (we're used in production), please let me know how we can help.
Usage
Azure request

```shell
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "azure/<your-deployment-name>",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
```
gpt-3.5-turbo request

```shell
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-3.5-turbo",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
```
claude-2 request

```shell
curl http://0.0.0.0:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-2",
    "messages": [{"role": "user", "content": "Say this is a test!"}],
    "temperature": 0.7
  }'
```
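The same requests can be built from Python. A minimal sketch (assuming the proxy is running locally on port 8000, as in the curl examples above; `build_request` is a hypothetical helper, not part of LiteLLM) that constructs the identical JSON body and only needs a model-string swap to route between providers:

```python
import json
import urllib.request

# Local LiteLLM proxy endpoint, matching the curl examples above (assumption).
PROXY_URL = "http://0.0.0.0:8000/v1/chat/completions"

def build_request(model: str, content: str, temperature: float = 0.7):
    """Build the same JSON body the curl examples send to the proxy."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": content}],
        "temperature": temperature,
    }
    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    return req, payload

# Only the model string changes per provider; the request format stays the same.
req, payload = build_request("azure/<your-deployment-name>", "Say this is a test!")
# urllib.request.urlopen(req)  # uncomment once the proxy is running
```

Swapping `"azure/<your-deployment-name>"` for `"gpt-3.5-turbo"` or `"claude-2"` reproduces the other two curl calls unchanged.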