[Issue]: When we generate reports using an LLM, how do we keep the reports from carrying the LLM's own data?
Do you need to file an issue?
- [x] I have searched the existing issues and this bug is not already filed.
- [ ] My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
- [ ] I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.
Describe the issue
The reports currently generated by the LLM include information that is not present in the indexed files and appears to come from the model's own knowledge. Even when additional instructions are added to the prompt (see the excerpt below), the effect is minimal. A rough post-hoc grounding check is sketched after the prompt excerpt.
Remember:
- Every single statement must be explicitly supported by input data
- Include exact quotes in the validation section
- Track coverage of statements and their support
- If information isn't in the input data, don't include it
- Flag any statement that can't be directly tied to source data
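
As a stopgap while the prompt constraints remain unreliable, one option is to validate the generated report against the indexed input after the fact and flag unsupported sentences. The sketch below is not part of GraphRAG; the file paths, the term-extraction heuristic, and the 0.5 overlap threshold are all assumptions for illustration.

```python
"""Minimal post-hoc grounding check: flag report sentences whose key terms
are mostly absent from the indexed source text.

Not a GraphRAG API -- file names and thresholds are illustrative only."""

import re


def key_terms(text: str) -> set[str]:
    # Crude term extraction: lowercase words of 4+ letters.
    return {w.lower() for w in re.findall(r"[A-Za-z]{4,}", text)}


def flag_unsupported(report_text: str, source_text: str,
                     min_overlap: float = 0.5) -> list[str]:
    """Return report sentences whose key terms mostly do not appear in the source."""
    source_terms = key_terms(source_text)
    flagged = []
    for sentence in re.split(r"(?<=[.!?])\s+", report_text):
        terms = key_terms(sentence)
        if not terms:
            continue
        overlap = len(terms & source_terms) / len(terms)
        if overlap < min_overlap:
            flagged.append(sentence)
    return flagged


if __name__ == "__main__":
    # Hypothetical paths; substitute your own indexed input and generated report.
    with open("input/indexed_documents.txt", encoding="utf-8") as f:
        source = f.read()
    with open("output/community_report.txt", encoding="utf-8") as f:
        report = f.read()
    for sentence in flag_unsupported(report, source):
        print("UNSUPPORTED:", sentence)
```

This only catches statements whose wording diverges from the source, so it complements rather than replaces the prompt-level instructions above.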
Steps to reproduce
No response
GraphRAG Config Used
# Paste your config here
Logs and screenshots
No response
Additional Information
- GraphRAG Version:
- Operating System:
- Python Version:
- Related Issues:
Which LLM are you using?
#1543
I think it is an LLM issue.