2 comments by zhuang-weiming
> No matter where Ollama runs, as long as the network is accessible, RAGFlow can use it.

Good to know, since I set up the LLM via Ollama on localhost...
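A minimal sketch of the point above: when RAGFlow runs in Docker, it cannot reach an Ollama that listens only on the host's loopback interface, so Ollama needs to listen on a network-reachable address. The commands below assume Ollama's default port (11434) and its documented `OLLAMA_HOST` environment variable; the exact base URL to enter in RAGFlow depends on your deployment.

```shell
# Bind Ollama to all interfaces so a containerized RAGFlow can reach it
# (default is 127.0.0.1, which is invisible from inside a container).
OLLAMA_HOST=0.0.0.0:11434 ollama serve &

# Sanity-check that the API responds before configuring RAGFlow:
curl http://localhost:11434/api/tags

# In RAGFlow's model-provider settings, point the Ollama base URL at a
# host address the container can resolve, e.g. http://host.docker.internal:11434
```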
Got it, I will install RAGFlow in my local environment. Please help close this issue. Thank you~