Tuan Anh Nguyen Dang (Tadashi)
Noted in the to-do list.
The examples we used follow the Ollama OpenAI API specs: `https://github.com/ollama/ollama/blob/main/docs/openai.md#curl`. Please use the Test Connection feature to make sure the Ollama connection is working properly for both LLM & embedding...
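For a quick sanity check outside the app, you can hit Ollama's OpenAI-compatible endpoints directly with curl (a sketch assuming a default local install on port 11434; the model names here are just examples — use ones you have pulled):

```shell
# Chat completion via Ollama's OpenAI-compatible API (assumes `llama3` is pulled)
curl http://localhost:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

# Embedding via the same compatibility layer (assumes `nomic-embed-text` is pulled)
curl http://localhost:11434/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{"model": "nomic-embed-text", "input": "test sentence"}'
```

If both calls return JSON (not a connection error), the Ollama side is fine and the issue is in the app configuration.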
@S4Spares How do you connect to the local LLM from the Docker image? Through Ollama? Did you use the Test Connection feature?
Working on this to reproduce the issue.
Meanwhile, could anyone test with the `main-ollama` image to check whether it works while `main-full` does not?
Make sure that you set the env var `USE_CUSTOMIZED_GRAPHRAG_SETTING=true` in the `docker run` command.
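For example (a sketch only — the port mapping and other flags here are assumptions, adjust them to your setup):

```shell
# Run the full image with the customized GraphRAG setting enabled
docker run \
  -e USE_CUSTOMIZED_GRAPHRAG_SETTING=true \
  -p 7860:7860 \
  -it --rm \
  ghcr.io/cinnamon/kotaemon:main-full
```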
@S4Spares It seems the problem is that the image name `ghcr.io/cinnamon/kotaemon:latest-full` is not correct for the latest version. You can try `ghcr.io/cinnamon/kotaemon:main-full` to get the latest development image.
Check out the latest README: https://github.com/Cinnamon/kotaemon. Currently, local LLM support via Ollama is the only option.
@thelamer working on a fix.
Maybe the endpoint is not set correctly. Please see https://github.com/microsoft/Form-Recognizer-Toolkit/issues/40#issuecomment-2095048075