DB-GPT

[Feature]: Azure OpenAI Support

Open Mshz2 opened this issue 1 year ago • 13 comments

Is your feature request related to a problem? Please describe. The server URL for the OpenAI API does not work with an Azure OpenAI endpoint URL.

Describe the solution you'd like
OPENAI_TYPE=Azure
PROXY_API_KEY={azure key}
PROXY_SERVER_URL=https://{resourcename}.openai.azure.com

Describe alternatives you've considered I tried
PROXY_API_KEY=********
PROXY_SERVER_URL=https://****.openai.azure.com/openai/deployments/chatgpt35/chat/completions?api-version=2023-03-15-preview
but got an error in the frontend (screenshot in the original issue).

Mshz2 avatar Aug 11 '23 12:08 Mshz2
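
For context, Azure OpenAI endpoints are addressed per deployment and require an api-version query parameter, unlike the standard OpenAI base URL. The chat completions URL generally follows this pattern (placeholder names are illustrative, not values from this thread):

https://{resource-name}.openai.azure.com/openai/deployments/{deployment-name}/chat/completions?api-version={api-version}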

Azure is not supported currently. Could you open a PR for that?

csunny avatar Aug 12 '23 01:08 csunny

Azure is not supported currently. Could you open a PR for that?

I would love to, but unfortunately I am not sure I have sufficient skill. Anyway, I believe many would like to deploy your nice product in the cloud; providers like Azure are popular because of their OpenAI GPT model integration.

Mshz2 avatar Aug 12 '23 11:08 Mshz2

i can handle it @csunny

cason0126 avatar Nov 01 '23 09:11 cason0126

LLM_MODEL=proxyllm
PROXY_API_KEY={your key}
PROXY_API_BASE=https://{your domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo

You can use a config like this. It works @Mshz2

cason0126 avatar Nov 02 '23 06:11 cason0126
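
As a rough guide to the settings above (a hypothetical, filled-in example; the key, resource name, and deployment name are placeholders, not values from this thread): PROXY_API_BASE is the Azure resource endpoint, PROXY_API_VERSION maps to Azure's api-version query parameter, and PROXYLLM_BACKEND is the name of your Azure deployment.

LLM_MODEL=proxyllm
PROXY_API_TYPE=azure
PROXY_API_KEY=<your-azure-openai-key>
PROXY_API_BASE=https://my-resource.openai.azure.com/
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=my-gpt35-deployment
PROXY_SERVER_URL=xxxx  # placeholder; see discussion below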

@cason0126 nice

yihong0618 avatar Nov 02 '23 06:11 yihong0618

LLM_MODEL=proxyllm PROXY_API_KEY={your key} PROXY_API_BASE=https://{your domain}.openai.azure.com/ PROXY_API_TYPE=azure PROXY_SERVER_URL=xxxx PROXY_API_VERSION=2023-05-15 PROXYLLM_BACKEND=gpt-35-turbo

You can use a config like this. It works @Mshz2

What is PROXY_SERVER_URL used for? Thanks @cason0126

riverind avatar Nov 20 '23 06:11 riverind

I guess it is a redundant configuration item. You can leave this part blank, or just fill in XXX as the value. As for why it is there and how it could be improved, I am still learning myself.

cason0126 avatar Nov 20 '23 08:11 cason0126

Thanks. Did you get it running successfully?

If so, can you share your Azure parameters?

If I set proxy_server_url to an empty value, it fails with the following error: dbgpt_server.py: error: the following arguments are required: --proxy_server_url

riverind avatar Nov 20 '23 09:11 riverind

It works for me. My parameters are:
LLM_MODEL=proxyllm
PROXY_API_KEY=your key
PROXY_API_BASE=https://{your_domain}.openai.azure.com/
PROXY_API_TYPE=azure
PROXY_SERVER_URL=xxxx
PROXY_API_VERSION=2023-05-15
PROXYLLM_BACKEND=gpt-35-turbo

cason0126 avatar Nov 21 '23 08:11 cason0126
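
A note on the required-argument error above: dbgpt_server.py insists that --proxy_server_url be non-empty, so any arbitrary placeholder (as in the config above) appears to be enough to satisfy the check, and the value does not seem to be used when PROXY_API_TYPE=azure. An illustrative line:

PROXY_SERVER_URL=unused-placeholder  # any non-empty value; apparently ignored for Azure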

@csunny @cason0126 thanks for the feedback. It is working now ;)

By any chance, how can I change the config similarly to use the Azure embedding model? I tried it like this, but it did not work:

EMBEDDING_MODEL=proxy_openai
proxy_openai_proxy_server_url=https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings)
proxy_openai_proxy_api_key=password
proxy_openai_proxy_backend=text-embedding-ada-002

Mshz2 avatar Nov 22 '23 16:11 Mshz2
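
Two details in the config above may be worth checking (not verified against DB-GPT's embedding proxy internals): the URL ends with a stray ")", and Azure endpoints normally require an api-version query parameter. A corrected URL would presumably look like this (the API version shown is illustrative):

proxy_openai_proxy_server_url=https://myresourcename.openai.azure.com/openai/deployments/text-embedding-ada-002/embeddings?api-version=2023-05-15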

LLM_MODEL=proxyllm PROXY_API_KEY={your key} PROXY_API_BASE=https://{your domain}.openai.azure.com/ PROXY_API_TYPE=azure PROXY_SERVER_URL=xxxx PROXY_API_VERSION=2023-05-15 PROXYLLM_BACKEND=gpt-35-turbo

The only change to the parameters above: I used PROXYLLM_BACKEND="mydeploymentname" (my Azure deployment name), and it works for me.

AbhijitManepatil avatar Feb 05 '24 12:02 AbhijitManepatil

@Mshz2 Hi, have you figured out how to use the Azure embedding model?

Cualacin0 avatar May 08 '24 06:05 Cualacin0

I'm not using the tool anymore))

Mshz2 avatar May 08 '24 10:05 Mshz2