[Bug]: concurrent_requests 25 has no effect in graphrag since it switched to fnllm, when using vllm/sglang
Do you need to file an issue?
- [ ] I have searched the existing issues and this bug is not already filed.
- [ ] My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
- [ ] I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.
Describe the bug
When using an OpenAI-compatible model served by vllm/sglang, `concurrent_requests: 25` has no effect since graphrag 1.0 switched to fnllm.
With graphrag 0.5, the same setting took effect and requests ran in parallel.
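For reference, this is the setting in question as it appears under the `llm` block of `settings.yaml`. A minimal sketch only; the endpoint, model name, and key values are placeholders, not my actual config:

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}         # any non-empty value for a local server
  type: openai_chat
  model: my-local-model                # placeholder model name served by vllm/sglang
  api_base: http://localhost:8000/v1   # OpenAI-compatible endpoint (placeholder)
  concurrent_requests: 25              # expected to allow up to 25 in-flight requests
```

The expectation is that indexing issues roughly 25 parallel requests against the local server; since 1.0 the requests appear to run one at a time regardless of this value.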
Steps to reproduce
No response
Expected Behavior
No response
GraphRAG Config Used
# Paste your config here
Logs and screenshots
No response
Additional Information
- GraphRAG Version:
- Operating System:
- Python Version:
- Related Issues:
Have you resolved it?
I'm facing the same issue. The knowledge graph construction process is very slow.