[Ollama][Other] GraphRAG OSS LLM community support
What I tried:
I ran GraphRAG on my local GPU and tried pointing it at a model served by Ollama by changing the `api_base` in the `settings.yaml` file:
```yaml
model: llama3:latest
api_base: http://localhost:11434/v1 #https://
```
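For context, these keys sit under the `llm` section of the generated `settings.yaml`. A minimal sketch of the surrounding block (key names taken from my copy of the default config, so treat them as an assumption for other GraphRAG versions):

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}   # Ollama ignores the key, but GraphRAG expects one to be set
  type: openai_chat              # Ollama exposes an OpenAI-compatible chat endpoint
  model: llama3:latest
  api_base: http://localhost:11434/v1
```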
Error during indexing:
```
graphrag.index.reporting.file_workflow_callbacks INFO Error Invoking LLM details={'input': '\n-Goal-\nGiven a text document that is pot....}
```
Commands:
```shell
# initialize
python -m graphrag.index --init --root .
# index
python -m graphrag.index --root .
# query (global)
python -m graphrag.query --root . --method global "query"
# query (local)
python -m graphrag.query --root . --method local "query"
```
Does GraphRAG support other self-hosted LLM server frameworks?
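Independent of GraphRAG itself, it can help to confirm that the Ollama endpoint accepts OpenAI-style requests before running the indexing pipeline. A minimal stdlib-only sketch that builds such a request (the dummy `Authorization` header is an assumption: Ollama ignores the key, but OpenAI-style clients usually require one to be present):

```python
import json
from urllib import request

# Same values as in settings.yaml above.
API_BASE = "http://localhost:11434/v1"
MODEL = "llama3:latest"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-style /chat/completions request for the local server."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer ollama",  # dummy key; Ollama does not check it
        },
    )

req = build_chat_request("ping")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
```

Sending this with `urllib.request.urlopen(req)` should return a JSON chat completion if the server is up. Note that GraphRAG also makes embedding calls (the `/v1/embeddings` route), which in community reports is often the part that fails with Ollama-backed setups, so it is worth checking that route separately as well.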