Kevin Hu
> empty

Is this deployed locally?
Could you share a sample of your file so I can debug it?
It's about XInference; you could report this issue [here](https://github.com/xorbitsai/inference/issues).
Good feature. This is what we plan to do next.
I uploaded it to the demo website and it works fine. If you deployed locally, have a look at the logs using `docker logs -f ragflow-server`.
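For a local deployment, a quick way to inspect the server logs is via the docker CLI (a sketch, assuming the default container name `ragflow-server` from RAGFlow's compose setup):

```shell
# Follow the RAGFlow server logs live (press Ctrl+C to stop):
#   docker logs -f ragflow-server
# The guarded, non-blocking variant below scans recent output for errors
# instead, and is a no-op if docker is not installed on this machine.
if command -v docker >/dev/null 2>&1; then
  docker logs --tail 200 ragflow-server 2>&1 | grep -i error || true
fi
```

`--tail 200` limits output to the most recent lines; `2>&1` is needed because `docker logs` writes the container's stderr stream to stderr.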
I think RAGFlow's Q&A already meets this requirement well. The reason we use RAG is that the LLM can identify similar questions, so we don't need to enumerate every question.
Correct. Search and retrieval don't involve the LLM.
We will provide an API for this soon. BTW, could you specify the input and output requirements?
Could you share a sample of your file?
I couldn't reproduce the issue using this file. Could you create a new knowledge base and upload the file again?