lobe-chat
[Request] Add support for local deployment via Xinference, not just Ollama
🥰 Description of requirements
It is recommended to add support for local deployment via Xinference, not just Ollama.
🧐 Solution
Adding support for its API interface should be sufficient.
📝 Supplementary information
No response
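Since Xinference exposes an OpenAI-compatible REST API (served on port 9997 by default), one way the requested integration could look is reusing an OpenAI-style request path with only the base URL swapped. The sketch below is illustrative, not lobe-chat's actual implementation; the model name `my-local-model` is a placeholder for whatever model the user has launched in Xinference.

```typescript
// Minimal sketch: build a chat-completion request against a local
// Xinference server's OpenAI-compatible endpoint. Only the base URL
// differs from a regular OpenAI-style provider.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface BuiltRequest {
  url: string;
  init: { method: string; headers: Record<string, string>; body: string };
}

function buildXinferenceRequest(
  messages: ChatMessage[],
  model = "my-local-model", // placeholder model id
  baseURL = "http://localhost:9997/v1", // Xinference's default port
): BuiltRequest {
  return {
    url: `${baseURL}/chat/completions`,
    init: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

const req = buildXinferenceRequest([{ role: "user", content: "Hello" }]);
console.log(req.url); // http://localhost:9997/v1/chat/completions
```

The request could then be sent with `fetch(req.url, req.init)`; because the payload follows the OpenAI chat-completion schema, existing response handling would carry over unchanged.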
👀 @MarkHappyShao
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.
@MarkHappyShao Could you give a reason?
Add it here: https://github.com/lobehub/lobe-chat/discussions/1284
✅ @MarkHappyShao
This issue is closed. If you have any questions, feel free to comment and we will reply.