connection error
Hi, could you please keep the `api_key` set and try again?
You need to set the `api_type` to `open_llm` instead of `openai`.
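A minimal sketch of what that LLM section might look like, assuming the config keys `api_type`, `base_url`, and `api_key` mentioned in this thread (the exact schema of your `config2.yaml` may differ — check the repo's example config):

```yaml
llm:
  api_type: open_llm                    # not "openai", per the advice above
  base_url: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint
  api_key: ollama                       # placeholder; see discussion below
  model: llama3                         # hypothetical model name — use whatever `ollama list` shows
```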
wangzhaoyu21 wrote on Wednesday, March 26, 2025 at 09:43:
I also encountered this problem. I tried changing the base_url of the LLM to http://localhost:11434/v1. Here is my configuration:
[image: configuration screenshot] https://github.com/user-attachments/assets/d7f29d65-e297-4525-b6ec-9b193a1cc98e
You can try setting the base_url of the LLM to http://localhost:11434/v1.
The result is still the same. Besides modifying the configuration file, what else might I need to check? Also, is the API_KEY required for this to work? I'm not able to install OpenWebUI, so I'm unsure how to set up and use an API_KEY for Ollama.
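On the API_KEY question: Ollama's OpenAI-compatible endpoint does not validate the key, but OpenAI-style clients typically refuse an empty one, so a non-empty placeholder usually suffices. A small sketch (the helper name and config keys here are hypothetical, chosen to mirror the settings discussed in this thread):

```python
import os

def ollama_llm_config(base_url="http://localhost:11434/v1"):
    """Assemble the LLM settings discussed in this thread (hypothetical helper)."""
    return {
        "api_type": "open_llm",  # per the maintainer's advice above
        "base_url": base_url,    # Ollama's OpenAI-compatible endpoint
        # Any non-empty placeholder satisfies client-side validation;
        # Ollama itself does not check the key's value.
        "api_key": os.environ.get("LLM_API_KEY", "ollama"),
    }

print(ollama_llm_config())
```

So you should not need OpenWebUI at all just to obtain a key — a dummy value in the config is enough, as long as the field is not left blank.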
This is my config2.yaml, but I have another error.
I gave up on using Ollama and switched to a cloud-based model; now it's running fine.
Okay, I'm glad to hear this.