
No ollama LLM found

Open suoko opened this issue 1 year ago • 8 comments

I ran the docker compose up command and everything installed correctly. I entered the ollama docker container and installed llama2, but when I run devika, no LLM is found for Ollama. Should I configure something? Or are only some LLMs supported? Starcoder is not seen either.

Thanks

suoko avatar Apr 05 '24 11:04 suoko

Adding a reference to the similar existing issue: #300.

I am also facing the same issue.

cpAtor avatar Apr 05 '24 12:04 cpAtor

Any update on this one please. i am not able to select the local model

heartsiddharth1 avatar Apr 05 '24 17:04 heartsiddharth1

If you are using Ollama in Docker, check the serve host of Ollama. I guess you have to change it from the default one.

ARajgor avatar Apr 06 '24 04:04 ARajgor
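To illustrate the "serve host" suggestion above, here is a minimal sketch of what the Ollama service might look like in docker-compose.yml. The service name `ollama-service` and the network layout are assumptions for illustration, not taken from the devika repository; `OLLAMA_HOST` is Ollama's standard environment variable for choosing its bind address.

```yaml
# Hypothetical docker-compose.yml fragment (names are illustrative).
services:
  ollama-service:
    image: ollama/ollama
    ports:
      - "11434:11434"
    environment:
      # Bind Ollama to all interfaces so other containers on the compose
      # network can reach it, instead of the default loopback-only binding.
      - OLLAMA_HOST=0.0.0.0
```

With a setup like this, other containers can reach Ollama at `http://ollama-service:11434` rather than `http://localhost:11434`, which only resolves inside the Ollama container itself.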

I have the same problem.

ChanghongYangR avatar Apr 06 '24 05:04 ChanghongYangR

> If you are using Ollama in Docker, check the serve host of Ollama. I guess you have to change it from the default one.

The following worked for me:

  • updating the OLLAMA API endpoint in config.toml as follows:
        OLLAMA = "http://ollama-service:11434"
  • running docker compose up --build
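A quick way to sanity-check the endpoint configured above is to query Ollama's model-list API from inside the devika container. This is a sketch assuming the compose service is named `ollama-service` as in the config.toml value quoted in this thread; `/api/tags` is the Ollama REST endpoint that lists pulled models.

```shell
# From inside the devika container, confirm the Ollama API is reachable
# and list the models it has pulled.
curl -s http://ollama-service:11434/api/tags
```

If this returns an empty model list, devika will also see no LLMs, and you need to `ollama pull` a model inside the Ollama container first.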

cpAtor avatar Apr 07 '24 05:04 cpAtor

Which language model in ollama works properly for this project?

Ahmet0691 avatar Apr 07 '24 07:04 Ahmet0691

When I turned off my VPN connection, it worked.

ChanghongYangR avatar Apr 07 '24 11:04 ChanghongYangR

Any updates here? I'm running Ollama but Devika still cannot recognize it.

(screenshots: `ollama serve` running in an administrator command prompt, and config.toml open in Visual Studio Code)

kuendeee avatar Apr 26 '24 04:04 kuendeee