katmai
> > the models list is still wrong. that is not the model list i have loaded in ollama locally. that's just a random list of models that ollama has...
That's the local LLM setup outlined in the docs here: https://github.com/OpenDevin/OpenDevin/blob/main/docs/documentation/LOCAL_LLM_GUIDE.md
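As a sanity check against that guide, you can compare the models actually loaded in your local Ollama instance with what you've configured. The snippet below is only a sketch: it assumes the env-var names used in the guide (`LLM_MODEL`, `LLM_BASE_URL`, `LLM_API_KEY`) and uses `llama2` as an example model name.

```shell
# List the models actually pulled into the local Ollama instance --
# this is the list the app should show, not Ollama's public catalog:
ollama list

# Example env-var setup per the local LLM guide (names assumed from the
# guide; pick a model that appears in `ollama list` above):
export LLM_API_KEY="ollama"                   # placeholder; Ollama needs no real key
export LLM_BASE_URL="http://localhost:11434"  # default Ollama endpoint
export LLM_MODEL="ollama/llama2"              # litellm-style "ollama/<model>" name
```

If the model string in the config doesn't match an entry from `ollama list`, the logs can end up showing a different model than the one you intended to run.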
> for @katmai: in his config he used ollama, but the log shows llama2. For @imperiousprashant: in his config he used gpt4, but the log shows meta-llama/ >...