
[Feature Request]: Is there any way to use a vLLM LLM service in RAGFlow?

Open qiufengyuyi opened this issue 1 year ago • 1 comment

Is there an existing issue for the same feature request?

  • [X] I have checked the existing issues.

Is your feature request related to a problem?

No response

Describe the feature you'd like

RAGFlow currently supports Xinference and Ollama for integrating LLM services. Since vLLM can significantly boost serving throughput in production environments, I'd like to know: is there a way to integrate a vLLM service API into RAGFlow?

Describe implementation you've considered

No response

Documentation, adoption, use case

No response

Additional information

No response

qiufengyuyi avatar May 21 '24 11:05 qiufengyuyi

Same question here: is vLLM supported now, or is it planned?

kevinbaby0222 avatar Sep 27 '24 08:09 kevinbaby0222

@qiufengyuyi @kevinbaby0222 Thanks for your suggestion, and apologies for the delayed response! ⏳🙏

RAGFlow now supports models served through Ollama, Xinference, and vLLM, which should cover the feature you were looking for. 🤖✨
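For anyone landing here later, a quick way to confirm your vLLM deployment is reachable before adding it in RAGFlow is to hit vLLM's OpenAI-compatible endpoint directly. This is a minimal sketch, assuming vLLM was launched with `vllm serve <model> --port 8000`; the model name, port, and prompt below are placeholders, not anything prescribed by RAGFlow:

```python
# Minimal connectivity check against a locally running vLLM server.
# Assumes vLLM's OpenAI-compatible API server is up, e.g.:
#   vllm serve Qwen/Qwen2-7B-Instruct --port 8000
# Adjust the base_url and model name to match your deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # vLLM's OpenAI-compatible endpoint
    api_key="EMPTY",  # vLLM accepts any key unless one was configured at launch
)

response = client.chat.completions.create(
    model="Qwen/Qwen2-7B-Instruct",  # must match the model vLLM is serving
    messages=[{"role": "user", "content": "Hello from a RAGFlow setup check"}],
)
print(response.choices[0].message.content)
```

If this returns a completion, the same base URL and model name should work when registering the provider in RAGFlow's model settings.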

Please feel free to close this issue. If it remains open, we'll include it in our next round of issue cleanup. 🧹

Thanks again for your constructive feedback — we truly appreciate it! 💡🚀

which-W avatar May 12 '25 02:05 which-W