How to set up GraphRAG with local Ollama
Description
How do I set up GraphRAG with a local Ollama instance?
Reproduction steps
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
Screenshots

Logs
No response
Browsers
No response
OS
No response
Additional information
No response
I briefly looked through the code and noticed an explicit reference to OpenAI, e.g. https://github.com/Cinnamon/kotaemon/blob/8be8a4a9d048d551aaf79b98e235ea41ec94b695/libs/ktem/ktem/index/file/graph/pipelines.py#L232 — it might not be ready for local LLMs.
Another issue is that GraphRAG is a separate program with its own configuration. It could be configured to use another LLM, but that's a separate question: https://github.com/microsoft/graphrag/issues/657
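For reference, graphrag reads its model settings from a `settings.yaml`. A sketch of pointing both the chat and embedding models at Ollama's OpenAI-compatible endpoint — the model names and the exact schema here are assumptions based on the graphrag version current at the time, not verified against this repo:

```yaml
llm:
  api_key: ${GRAPHRAG_API_KEY}   # Ollama ignores the key, but graphrag expects one
  type: openai_chat
  model: llama3                  # assumption: any chat model pulled into Ollama
  api_base: http://localhost:11434/v1

embeddings:
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding
    model: nomic-embed-text      # assumption: an embedding model pulled into Ollama
    api_base: http://localhost:11434/v1
```

Note this only configures the standalone graphrag indexer; kotaemon's own pipeline may still call OpenAI directly, as the linked code shows.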
I'm stuck here:
Entity count: 318
2024-09-09 19:15:38.962 - openai._base_client/DEBUG - Sending HTTP Request: POST https://api.openai.com/v1/embeddings
Error embedding chunk {'OpenAIEmbedding': "Error code:
I think it should pick the default configured embeddings instead.
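The log above shows the embedding call still going to `https://api.openai.com/v1/embeddings`. A minimal sketch of what "picking the configured embeddings" would mean at the HTTP level: the same OpenAI-style request, but aimed at the local endpoint. This assumes Ollama's OpenAI-compatible API at its default port; the function name is hypothetical, and the request is built but not sent so it can be inspected offline.

```python
import json
import urllib.request

# Assumption: Ollama serves an OpenAI-compatible API at this base URL.
OLLAMA_BASE = "http://localhost:11434/v1"

def build_embedding_request(texts, model="nomic-embed-text", base_url=OLLAMA_BASE):
    """Build (but do not send) an OpenAI-style /embeddings request that
    targets the local server instead of api.openai.com seen in the log."""
    payload = json.dumps({"model": model, "input": texts}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/embeddings",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_embedding_request(["Entity description to embed"])
print(req.full_url)  # http://localhost:11434/v1/embeddings
```

Swapping the base URL is only half the story, though — the response shape matches, but the vectors themselves come from a different model (see the query-stage mismatch discussed below in this thread's later comments).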
@mkhludnev thanks. After investigating the graphrag component, I can build the GraphRAG index and run the pipeline successfully with my local Ollama.
But I got an error pointing to a mismatch between OpenAI and Ollama embeddings during the query stage. This error is hidden inside the kotaemon pipeline, and it seems like a lot of work to fix.
If someone manages to run Ollama for GraphRAG, please share the steps with us :)
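The query-stage mismatch can be illustrated in isolation. The dimensions below are illustrative (1536 is the published size for OpenAI's text-embedding-ada-002, 768 for nomic-embed-text): an index built with one embedding model cannot be queried with vectors from another, because the dimensions (and the embedding spaces themselves) differ.

```python
import math

# Hypothetical vectors standing in for the two models' outputs.
openai_vec = [0.1] * 1536   # e.g. text-embedding-ada-002
ollama_vec = [0.1] * 768    # e.g. nomic-embed-text

def cosine_similarity(a, b):
    # Vectors from different embedding models are not comparable:
    # the dot product is undefined when the dimensions disagree.
    if len(a) != len(b):
        raise ValueError(f"embedding dimension mismatch: {len(a)} != {len(b)}")
    norm = lambda v: math.sqrt(sum(x * x for x in v))
    return sum(x * y for x, y in zip(a, b)) / (norm(a) * norm(b))

# Querying an OpenAI-built index with an Ollama query vector fails,
# so the whole index must be rebuilt with the local embedding model.
```

This is why switching the endpoint alone is not enough: the GraphRAG index has to be rebuilt end-to-end with the same local embedding model used at query time.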
Finally, it works. I will make a PR to this repo.
Please check the latest version and instruction for setting up GraphRAG here https://github.com/Cinnamon/kotaemon#setup-graphrag.