DB-GPT
[Feature]: Azure OpenAI Support
Is your feature request related to a problem? Please describe. The server URL for the OpenAI API is not applicable to the Azure OpenAI endpoint URL.
Describe the solution you'd like
OPENAI_TYPE=Azure
PROXY_API_KEY={azure key}
PROXY_SERVER_URL=https://{resourcename}.openai.azure.com
Describe alternatives you've considered
I tried:
PROXY_API_KEY=********
PROXY_SERVER_URL=https://****.openai.azure.com/openai/deployments/chatgpt35/chat/completions?api-version=2023-03-15-preview
But got the below error in the frontend:
Azure is not supported currently, can you give a pr for that?
I would love to, but unfortunately I'm not sure I have sufficient skill. Anyway, I believe many would like to deploy your nice product in the cloud; providers like Azure are popular precisely because of their OpenAI GPT model integration.
I can handle it @csunny
LLM_MODEL=proxyllm
PROXY_API_KEY={your key}
PROXY_API_BASE=https://{your domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo
You can use a config like this. It works @Mshz2
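For context, Azure OpenAI routes requests per deployment rather than per model, which is why the plain OpenAI base URL is not enough. Below is a minimal Python sketch (not DB-GPT's actual code) of how the values above compose into the final Azure endpoint; `mydomain` is a hypothetical placeholder for `{your domain}`:

```python
def azure_chat_url(api_base: str, deployment: str, api_version: str) -> str:
    # Join PROXY_API_BASE, the deployment name (PROXYLLM_BACKEND),
    # and PROXY_API_VERSION into the Azure chat-completions endpoint.
    base = api_base.rstrip("/")
    return (f"{base}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")

print(azure_chat_url("https://mydomain.openai.azure.com/",
                     "gpt-35-turbo", "2023-05-15"))
# → https://mydomain.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2023-05-15
```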
@cason0126 nice
What is PROXY_SERVER_URL for? Thanks @cason0126
I guess it is a redundant configuration item. You can leave this part blank, or just fill in XXX as the value. As for why, and for iterative improvement, I am also still learning.
Thanks, did you run it successfully?
If so, can I know your Azure parameters?
If I set proxy_server_url to blank, an error occurs. The error info is as follows: dbgpt_server.py: error: the following arguments are required: --proxy_server_url
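This matches standard argparse behavior: when the option is required and the env value resolves to nothing, the flag is effectively absent, so any non-empty placeholder satisfies the parser. A sketch of the failure mode (assuming dbgpt_server.py declares the argument as required; the actual setup may differ):

```python
import argparse

parser = argparse.ArgumentParser(prog="dbgpt_server.py")
parser.add_argument("--proxy_server_url", required=True)

# A placeholder value such as "xxxx" satisfies the parser:
args = parser.parse_args(["--proxy_server_url", "xxxx"])
print(args.proxy_server_url)  # xxxx

# Leaving it unset makes argparse exit with the reported error:
try:
    parser.parse_args([])
except SystemExit:
    print("parser exited: --proxy_server_url is required")
```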
It works for me. My parameters are:
LLM_MODEL=proxyllm
PROXY_API_KEY=your key
PROXY_API_BASE=https://{your_domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo
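Since a blank value can trip up startup (see the --proxy_server_url error earlier in the thread), the required keys from this .env fragment can be sanity-checked before launching. A hypothetical helper, not part of DB-GPT, with the key names taken from this thread:

```python
# Required settings, mirroring the working Azure config above.
REQUIRED = ["LLM_MODEL", "PROXY_API_KEY", "PROXY_API_BASE",
            "PROXY_API_TYPE", "PROXY_SERVER_URL",
            "PROXY_API_VERSION", "PROXYLLM_BACKEND"]

def missing_keys(env):
    """Return the required settings that are absent or blank."""
    return [k for k in REQUIRED if not env.get(k, "").strip()]

sample = {
    "LLM_MODEL": "proxyllm",
    "PROXY_API_KEY": "your-key",
    "PROXY_API_BASE": "https://your-domain.openai.azure.com/",
    "PROXY_API_TYPE": "azure",
    "PROXY_SERVER_URL": "xxxx",  # placeholder, but must not be blank
    "PROXY_API_VERSION": "2023-05-15",
    "PROXYLLM_BACKEND": "gpt-35-turbo",
}
print(missing_keys(sample))                              # []
print(missing_keys({**sample, "PROXY_SERVER_URL": ""}))  # ['PROXY_SERVER_URL']
```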
@csunny @cason0126 thanks for the feedback. It is working now ;)
By any chance, how can I make a similar change to use the Azure embedding model? I tried this, but it did not work:
EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings
proxy_openai_proxy_api_key=password
proxy_openai_proxy_backend=text-embedding-ada-002
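One thing to note about the attempted embedding URL: Azure OpenAI endpoints expect an api-version query parameter on every call, which the value above omits. A sketch of the fully qualified embeddings endpoint (reusing the 2023-05-15 api-version from the chat config is an assumption here):

```python
def azure_embeddings_url(api_base: str, deployment: str, api_version: str) -> str:
    # Azure requires an api-version query parameter on every call;
    # the attempt above omits it, which is one likely cause of failure.
    base = api_base.rstrip("/")
    return (f"{base}/openai/deployments/{deployment}"
            f"/embeddings?api-version={api_version}")

print(azure_embeddings_url("https://myresourcename.openai.azure.com",
                           "text-embedding-ada-002", "2023-05-15"))
# → https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2023-05-15
```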
LLM_MODEL=proxyllm PROXY_API_KEY={your key} PROXY_API_BASE=https://{your domain}.openai.azure.com/ PROXY_API_TYPE=azure PROXY_SERVER_URL=xxxx PROXY_API_VERSION=2023-05-15 PROXYLLM_BACKEND=gpt-35-turbo
Only one change to the above parameters: using PROXYLLM_BACKEND="mydeploymentname" works for me
@Mshz2 Hi, have you figured out how to use the Azure embedding model?
I'm not using the tool anymore))