No ollama LLM found
I ran the `docker compose up` command and everything installed correctly. I entered the Ollama Docker container and installed llama2, but when I run Devika, no LLM is found for Ollama. Should I configure something, or are only some LLMs supported? Starcoder is not detected either.
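Concretely, what I did looks roughly like this (the container and model names are from my setup; yours may differ):

```sh
docker compose up -d        # all services start without errors
docker exec -it ollama bash # "ollama" is my container's name from `docker ps`
ollama pull llama2          # install the model inside the container
ollama list                 # llama2 is listed here, but Devika still shows no Ollama models
```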
Thanks
Adding a reference to a similar existing issue: #300.
I am also facing the same issue.
Any update on this one, please? I am not able to select the local model.
If you are using Ollama in Docker, check the serve host of Ollama. I guess you have to change it from the default one.
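By default `ollama serve` only listens on 127.0.0.1, which other containers cannot reach. A minimal sketch of the relevant compose settings (the service name `ollama-service` is just an example; use whatever name your setup defines):

```yaml
# docker-compose.yml (sketch)
services:
  ollama-service:
    image: ollama/ollama
    environment:
      - OLLAMA_HOST=0.0.0.0   # listen on all interfaces, not only loopback
    ports:
      - "11434:11434"
```

With that in place, other containers on the same compose network can reach Ollama at `http://ollama-service:11434`.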
I have the same problem.
The following worked for me:
- Updating the OLLAMA `API_ENDPOINT` in `config.toml` as follows: `OLLAMA = "http://ollama-service:11434"`
- Running `docker compose up --build`
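For anyone else hitting this: inside the Devika container, `localhost`/`127.0.0.1` refers to the Devika container itself, so the default endpoint can never reach a separate Ollama container. A sketch of the relevant `config.toml` entry (assuming the endpoint lives under the `[API_ENDPOINTS]` section as in the sample config, and that `ollama-service` matches the service name in your `docker-compose.yml`):

```toml
[API_ENDPOINTS]
# Point Devika at the Ollama container via its compose service name,
# which Docker's internal DNS resolves on the shared network.
OLLAMA = "http://ollama-service:11434"
```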
Which language models in Ollama work properly for this project?
When I turned off my VPN connection, it worked.
Any updates here? I'm running Ollama, but Devika still cannot recognize it.