Appointat
The specific OCR will be discussed and determined.
hi. If I remember correctly, the current data chat module does not support direct interaction with the graph database. Currently, only the GraphRAG module relies on the graph database....
> > hi. If I remember correctly, the current data chat module does not support direct interaction with the graph database. Currently, only the GraphRAG module relies on the graph database. > > You can find "GraphRAG" in the documentation. [@danMe66](https://github.com/danMe66) > > Is there any hope of supporting this later? Or is there some way we can achieve this ourselves? Thanks for your reply [@Appointat](https://github.com/Appointat) Support for features similar to "chat with tugraph" requires the use of some fine-tuned text2gql models. The...
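To make the text2gql idea concrete, here is a minimal conceptual sketch of how natural language could be turned into a graph query by a model behind an OpenAI-compatible endpoint. The endpoint, model name, and prompt are placeholders for illustration only; the actual feature would rely on a properly fine-tuned text2gql model rather than this generic call.

```python
# Conceptual sketch only: one way a text2gql step could look in principle.
# The endpoint, model name, and prompt are placeholders, not the project's
# actual pipeline; a real setup would use a fine-tuned text2gql model.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # hypothetical OpenAI-compatible endpoint
    api_key="EMPTY",
)

question = "Which nodes are connected to the 'GraphRAG' document node?"
resp = client.chat.completions.create(
    model="text2gql-demo",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": "Translate the user's question into a single graph query "
                       "(e.g. Cypher). Return only the query.",
        },
        {"role": "user", "content": question},
    ],
)
graph_query = resp.choices[0].message.content
print(graph_query)  # this query would then be executed against the graph database
```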
@coderabbitai resume
@willshang76 Thanks for your review, I will reply to your comments soon!
Hi @jiefei30, I need more information from the server.log (the log information you provided only comes from a LiteLLM warning, which is a common warning but does not affect...
I think the `.env` config with which you are calling LiteLLM is incorrect. If you are using a private model, you should use the local model invocation method: see `doc/zh-cn/deployment/config-env.md` and `https://docs.litellm.ai/docs/providers/vllm` (self-hosted...
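If it helps, here is a minimal sketch of what the LiteLLM call should roughly look like for a self-hosted vLLM endpoint, following the provider docs linked above. The model name and URL are placeholders for your own deployment; the matching `.env` keys are described in `doc/zh-cn/deployment/config-env.md`.

```python
# Minimal sketch, assuming a self-hosted vLLM server exposing an OpenAI-compatible API.
# Model name, URL, and key are placeholders; adjust them to your deployment.
import litellm

response = litellm.completion(
    model="hosted_vllm/Qwen2.5-7B-Instruct",  # "hosted_vllm/" routes to a self-hosted vLLM endpoint
    api_base="http://localhost:8000/v1",       # your private model server
    api_key="EMPTY",                           # vLLM typically does not validate the key
    messages=[{"role": "user", "content": "hello"}],
)
print(response.choices[0].message.content)
```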
Hi @watshare, Thanks for reaching out and providing the details. The key requirement for the embedding configuration is that the endpoint must be **OpenAI-compatible**. As outlined in our documentation (`doc/en-us/deployment/config-env.md`),...
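A quick way to check whether your embedding endpoint is really OpenAI-compatible is to call it directly with the official client. This is only a sketch; the base URL, API key, and model name below are placeholders for your own deployment.

```python
# Sketch for verifying an OpenAI-compatible embedding endpoint.
# base_url, api_key, and model are placeholders for your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # must expose the /v1/embeddings route
    api_key="EMPTY",
)

result = client.embeddings.create(
    model="bge-m3",        # placeholder embedding model name
    input=["hello world"],
)
print(len(result.data[0].embedding))  # prints the embedding dimension if the endpoint is compatible
```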