NextChat
[Feature Request] Can GLM4 be supported?
🥰 Requirement description
GLM4 is an excellent domestically developed large model. When will the GLM4 model / API be supported?
🧐 Proposed solution
Please add support for GLM4, either through local deployment or through API deployment; a rough sketch of the API route follows below.
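If API deployment is the easier path, note that Zhipu AI's open platform exposes GLM-4 through an OpenAI-compatible chat completions endpoint, so integration may mostly amount to pointing requests at that base URL. The TypeScript below is only a minimal sketch, not NextChat's actual provider code: the base URL and model name follow the public GLM-4 API docs, while the function name `chatWithGlm4` and the API key placeholder are illustrative.

```typescript
// Minimal sketch: calling GLM-4 via Zhipu AI's OpenAI-compatible
// chat completions endpoint (requires Node 18+ or a browser for fetch).
const GLM4_BASE_URL = "https://open.bigmodel.cn/api/paas/v4";

async function chatWithGlm4(apiKey: string, userMessage: string): Promise<string> {
  const res = await fetch(`${GLM4_BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`, // key from the Zhipu open platform
    },
    body: JSON.stringify({
      model: "glm-4",
      messages: [{ role: "user", content: userMessage }],
    }),
  });
  if (!res.ok) {
    throw new Error(`GLM-4 request failed: ${res.status} ${await res.text()}`);
  }
  const data = await res.json();
  // The response body mirrors the OpenAI chat completion schema.
  return data.choices[0].message.content;
}
```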
📝 Supplementary information
No response
Your message has been received! We will provide support in the future. It would be better if you could submit a PR directly. @wpfnlp