NextChat
Add Azure OpenAI API support.
Azure OpenAI product details portal: https://portal.azure.com/#view/Microsoft_Azure_ProjectOxford/CognitiveServicesHub/~/OpenAI
Screenshot of the result:
A fork for a quick preview: https://chatgpt.realduang.com/
Usage notes: if you want to use the Azure OpenAI API features, you will need to register your service in the portal above.
Three parameters are required for normal use: the custom domain, the instance name, and the API key. You can refer to the product documentation for these parameters.
It is worth noting that the custom subdomain is the value you will find in the details list once you have registered with Azure OpenAI.
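A minimal sketch of how the three required settings could combine into an Azure OpenAI request, assuming hypothetical names and an illustrative `api-version`; the actual PR code may differ:

```typescript
// Sketch: building an Azure OpenAI chat-completions URL from the three
// required settings. Field names and the api-version are assumptions.
interface AzureConfig {
  customDomain: string; // e.g. "https://my-resource.openai.azure.com"
  instanceName: string; // deployment name created in the Azure portal
  apiKey: string;       // key from the Azure OpenAI resource
}

function azureChatUrl(cfg: AzureConfig, apiVersion = "2023-05-15"): string {
  const base = cfg.customDomain.replace(/\/+$/, ""); // strip trailing slashes
  return `${base}/openai/deployments/${cfg.instanceName}` +
         `/chat/completions?api-version=${apiVersion}`;
}

// Azure authenticates with an "api-key" header instead of a Bearer token.
function azureHeaders(cfg: AzureConfig): Record<string, string> {
  return { "api-key": cfg.apiKey, "Content-Type": "application/json" };
}
```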
Fixed the code conflicts. Needs to be deployed.
@Yidadaa do you have any plans to merge this PR?
Is this already supported? How do I set it up?
nice try
Surprised that so many people have the same need. I'll fix up a new version.
@Yidadaa I have fixed all the conflicting code and supplied a demonstration of the result with brief instructions. If you think this feature is needed, please review it.
Of course, if you get a chance, please help me add a feature description to the README, thanks. I didn't find a suitable place to add it.
Is it usable yet?
Is it usable yet?
@daidi If you already have Azure resources, you can try the version with Azure OpenAI support on my fork before this PR is merged: https://chatgpt.realduang.com/
If you have suggestions for improvement, feel free to raise them here.
Is it usable now?
Is this usable yet?
@realDuang The Azure OpenAI Service feature is strongly needed. Is it usable yet? Thanks!
Is it usable yet?
I deployed your fork and can indeed configure Azure OpenAI in the settings. Is there a way to configure the Azure OpenAI parameters via server-side environment variables, similar to OPENAI_API_KEY? Thanks again for your work.
Is it usable yet?
This request is quite practical; that way it could be used directly after deployment.
One suggestion: besides the global settings, it would be better to support choosing the OpenAI service in per-chat settings. I mean, if the global setting uses OpenAI, users could still set a specific chat to use Azure OpenAI Services. Anyway, this PR is really cool! Thanks for the contribution!
This may require refactoring the project logic. As of now, there are no separate connection settings for each chat window; only the chat context is unshared. Such a change will not be attempted in this PR until it is clear that the repo owner wants it.
That's a good proposal. Judging from the code, once the access CODE validation passes, we could simply read the corresponding environment variables on Vercel via process.env.
However, in that case the user only enters a code, so how would the owner know whose token is being consumed? Would matching different codes to different APIs be a better approach?
I think this feature request could be collected in a separate issue to gauge interest, so as not to break the atomicity of this PR.
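The idea above could be sketched roughly as follows, assuming a Next.js API route where the access code has already been validated; the environment variable names are hypothetical, not what the project actually uses:

```typescript
// Sketch: after the access code is validated, read Azure settings from
// server-side environment variables. Variable names are hypothetical.
interface ServerAzureConfig {
  endpoint: string;
  apiKey: string;
}

// Optionally map different access codes to different API keys, so the
// owner can tell whose token is being consumed.
function resolveAzureConfig(
  accessCode: string,
  env: Record<string, string | undefined> = process.env,
): ServerAzureConfig | null {
  // Per-code override, e.g. AZURE_API_KEY_TEAM_A for code "TEAM_A"
  const perCodeKey = env[`AZURE_API_KEY_${accessCode}`];
  const apiKey = perCodeKey ?? env.AZURE_API_KEY;
  const endpoint = env.AZURE_ENDPOINT;
  if (!apiKey || !endpoint) return null;
  return { endpoint, apiKey };
}
```

With this shape, a shared `AZURE_API_KEY` serves all codes, while a per-code variable lets the owner attribute usage to a specific code.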
I created a PR to your repo to disable the model config UI when using Azure OpenAI Services. For further per-chat-session config, your branch will be helpful.
From a UX perspective, it could work like the OpenAI API: add an option in the settings panel to enable Azure API mode, and if the Azure parameters are set via environment variables, default to that configuration.
@realDuang A beginner question; I haven't found a way around this. I have already forked Yidadaa/ChatGPT-Next-Web, but when I try to fork realDuang/ChatGPT-Next-Web, it says the project already exists and cannot be forked again (No more forks can be created. These forks already exist). Can't I fork both projects at the same time? I also changed the project name, but I still get the same message.
You cannot fork the same upstream project twice. But you can still run the following commands on your local machine:
git remote add realDuang https://github.com/realDuang/ChatGPT-Next-Web.git
git fetch realDuang
git checkout -b duang/azure_openai realDuang/duang/azure_openai
But if I only pull the project locally, I can't deploy it to Vercel.
You can push it to your repo's main branch and import your repo into Vercel if you don't mind.
Before I make any further changes, I would like to ask the repo owner @Yidadaa what you think about this PR. I haven't seen any comments from you yet. If you have any concerns, please let me know. This PR has been open for nearly two months, and I'm getting a little tired of fixing code conflicts.
@realDuang
First of all, I want to state that this PR is an excellent contribution that meets the needs of many people.
However, I have been considering how to balance the server-side and client-side configurations. Introducing additional configuration options will greatly increase the complexity of the project. This PR adds Azure API configuration options to the settings page, which I don't think is a good choice, especially when the openai-azure-proxy project already meets the usage requirements. A better solution is to combine openai-azure-proxy and this project with Docker Compose, which can easily meet self-hosting requirements. As for deployment on Vercel, simply deploy the two projects separately and modify the BASE_URL accordingly.
When accepting this PR, I am more concerned about decoupling the front-end and back-end, which will make it easier to package the desktop app in the future.
Thanks for the reply. Your concern makes sense. Just my personal opinion: why not check out a new branch to try it, so we can put effort into this and see what it turns out to be?
@Yidadaa
Thank you for sharing your concern; it is very enlightening to me.
One thing I should add: on Vercel, you may need more than just a BASE_URL change. AOAI uses different model selection and parameter passing than OpenAI does, so even if you do the configuration on the server side, you still need these parameters, and it may take more effort to switch personal tokens.
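To illustrate the difference (values below are examples, not this PR's actual code): OpenAI and Azure OpenAI differ in the URL shape, the auth header, and where the model is selected.

```typescript
// Illustrative comparison of the two request shapes.
function openaiRequest(apiKey: string) {
  return {
    url: "https://api.openai.com/v1/chat/completions",
    headers: { Authorization: `Bearer ${apiKey}` },    // Bearer token auth
    body: { model: "gpt-3.5-turbo", messages: [] },    // model in the body
  };
}

function azureRequest(endpoint: string, deployment: string, apiKey: string) {
  return {
    // the model is chosen by the deployment segment of the URL
    url: `${endpoint}/openai/deployments/${deployment}` +
         `/chat/completions?api-version=2023-05-15`,
    headers: { "api-key": apiKey },                    // not a Bearer token
    body: { messages: [] },                            // no "model" field
  };
}
```

So a proxy or BASE_URL swap alone cannot hide these differences; something still has to rewrite the path, the header, and the body.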
As for combining it with openai-azure-proxy: that is a good approach, but in my personal experience so far, I prefer to use it on the web side without any extra dependencies.
Anyway, if you have a better plan to achieve this, maybe we can keep this PR open until then, if you like. I'll maintain this feature in my fork and try to keep it in sync with yours.
The official openai-python package now supports Azure as well, so I'm really looking forward to this PR. Also, from a user's perspective, is this project's main audience Chinese users or users worldwide? If I'm not mistaken, the expectation is a worldwide audience, but in practice the main supporters are Chinese users; after all, people who can access OpenAI directly don't need another wrapper. For Chinese users, the intent behind setting up Azure is obvious: they don't want to go through a proxy, and openai-azure-proxy adds yet another detour.
I'm afraid this comment somewhat narrows the scenarios where the Azure OpenAI APIs are used. For people choosing the Azure OpenAI service, it was never just about working around a proxy issue.
Very confused why this PR was rejected, or why the Azure OpenAI API isn't supported natively. It's a very hot requirement...
Because the calling conventions are different.