ragflow

[Feature Request]: Add LLM

Open · yanwun opened this issue 6 months ago • 1 comment

Is there an existing issue for the same feature request?

  • [X] I have checked the existing issues.

Is your feature request related to a problem?

No response

Describe the feature you'd like

Hi there, I have a question. I use Ollama as my LLM provider, and I know that when the "add LLM" function is called, the model is loaded into the Ollama service. Would it be possible for "add LLM" to not load the model, and instead load it only when a chat starts? If multiple users each use a different fine-tuned model in Ollama, loading them all at add time will exhaust GPU memory.
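A rough sketch of the behaviour I mean, assuming the standard Ollama HTTP API (`/api/show` to read model metadata without loading it, `/api/chat` with `keep_alive` to load it only when a conversation starts). The endpoint URL, model name, and function names are just placeholders, and field names may vary across Ollama versions:

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # assumed default Ollama endpoint


def register_model(name: str) -> dict:
    """Validate that the model exists at "add LLM" time without loading it.

    /api/show only returns model metadata, so it should not occupy GPU memory.
    """
    resp = requests.post(f"{OLLAMA_URL}/api/show", json={"model": name}, timeout=30)
    resp.raise_for_status()
    return resp.json()


def chat(name: str, messages: list[dict], keep_alive: str = "5m") -> str:
    """Load the model only when a chat actually starts.

    keep_alive controls how long Ollama keeps the model in memory afterwards,
    so other users' fine-tuned models can take its place instead of all models
    sitting in GPU memory at once.
    """
    resp = requests.post(
        f"{OLLAMA_URL}/api/chat",
        json={
            "model": name,
            "messages": messages,
            "stream": False,
            "keep_alive": keep_alive,
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]


if __name__ == "__main__":
    register_model("my-finetuned-model")  # cheap: metadata only (hypothetical model name)
    print(chat("my-finetuned-model", [{"role": "user", "content": "hello"}]))
```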

Describe implementation you've considered

No response

Documentation, adoption, use case

No response

Additional information

No response

yanwun · Aug 16 '24 01:08