[Bug]: knowledge_graph.html cannot be displayed correctly
### Do you need to file an issue?
- [x] I have searched the existing issues and this bug is not already filed.
- [x] I believe this is a legitimate bug, not just a question or feature request.
### Describe the bug
I deployed LightRAG following the official steps, but when I run the example with Python 3.10 and a local Ollama model, the generated knowledge graph is incomplete. What should I do?
### Steps to reproduce
Follow the official video
### Expected Behavior
No response
### LightRAG Config Used
Paste your config here
### Logs and screenshots
No response
### Additional Information
- LightRAG Version:
- Operating System:
- Python Version:
- Related Issues:
You could try using a bigger model; LightRAG's knowledge-graph generation relies heavily on the LLM's capability.
I had a similar issue. I was able to reduce "orphan" entities by using a better LLM, and also by reducing the chunk size to a value between 200 and 400 tokens and the overlap to between 20 and 100.
Ok, I'll try it. Thanks.
I am facing a similar issue. Which better LLM are you referring to, @BireleyX? I have this set up on a Mac Mini M4 (2024 model, 16 GB RAM), and the local models I am using are gemma2:2b and nomic-embed-text:latest.
What I meant by a better LLM was changing from gpt-4o-mini to gpt-4o. When possible, use the model variant with the most parameters your hardware can support (in the case of local AI).
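As a side note on the chunking suggestion above: the reason smaller chunks with some overlap help is that an entity mentioned near a chunk boundary appears in both neighbouring chunks, so the LLM sees it with context on both sides and is less likely to extract it as a disconnected "orphan" node. The sketch below is not LightRAG's actual code, just an illustration of overlapping token chunking with the suggested values (chunk size 300, overlap 50); the function name `chunk_tokens` is made up for this example.

```python
def chunk_tokens(tokens, chunk_size=300, overlap=50):
    """Split a token list into chunks of `chunk_size`, where consecutive
    chunks share `overlap` tokens. Illustrative only, not LightRAG code."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far the window advances each time
    chunks = []
    for start in range(0, len(tokens), step):
        chunks.append(tokens[start:start + chunk_size])
        if start + chunk_size >= len(tokens):
            break  # last window already covers the tail
    return chunks

# A 1000-token document with chunk_size=300 / overlap=50 yields chunks
# starting at 0, 250, 500, 750; each pair shares 50 boundary tokens.
tokens = [f"t{i}" for i in range(1000)]
chunks = chunk_tokens(tokens)
```

In LightRAG these values are controlled when constructing the `LightRAG` object; the parameter names `chunk_token_size` and `chunk_overlap_token_size` are my recollection of the API, so check them against the version you have installed.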