
Unable to configure Ollama via a 3rd-party URL

yrobla opened this issue 1 day ago · 0 comments

Bug description

I am trying to use MetaGPT via a third-party URL wrapper. It works for me with OpenAI and Claude:

llm:
  api_type: "openai"  # or azure / ollama / groq etc.
  model: "gpt-4-turbo"  # or gpt-3.5-turbo
  base_url: "http://localhost:8989/openai"
  api_key: "xxxx"
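
As far as I understand, with the openai api_type the SDK simply appends its own endpoint path to base_url, so the wrapper receives the full /chat/completions path. A small illustration, calling the OpenAI SDK directly against the wrapper rather than going through MetaGPT:

# Illustration only: how an OpenAI-style client composes the request URL from
# my base_url. The endpoint path is appended to base_url as-is, so the wrapper
# can route it.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8989/openai", api_key="xxxx")
resp = client.chat.completions.create(
    model="gpt-4-turbo",
    messages=[{"role": "user", "content": "hello"}],
)
print(resp.choices[0].message.content)
# POSTs to http://localhost:8989/openai/chat/completions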

But when I configure Ollama, I have a problem:

llm:
  api_type: "ollama"  # or azure / ollama / groq etc.
  model: "llama2"  # or gpt-3.5-turbo
  base_url: "http://localhost:8989/ollama"

But it fails with a 404 Not Found, because the endpoint needs to be /api/chat while the client only appends /chat, as in this property of the Ollama client:

    @property
    def api_suffix(self) -> str:
        return "/chat"
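
As far as I can tell, the request URL is just base_url concatenated with this suffix, so with my wrapper base_url the /api segment is missing. A rough sketch of the composition (compose_url is my own name for illustration, not a MetaGPT function):

# Rough sketch of how the final URL seems to be built from base_url + api_suffix.
def compose_url(base_url: str, api_suffix: str) -> str:
    return base_url.rstrip("/") + api_suffix

print(compose_url("http://localhost:8989/ollama", "/chat"))
# -> http://localhost:8989/ollama/chat      (what is sent, returns 404)
print(compose_url("http://localhost:8989/ollama", "/api/chat"))
# -> http://localhost:8989/ollama/api/chat  (what the wrapper actually expects)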

I also tried to set up a proxy:

llm:
  api_type: "ollama"  # or azure / ollama / groq etc.
  model: "llama2"  # or gpt-3.5-turbo
  base_url: "http://localhost:11434/api"  # or forward url / other llm url
  proxy: "http://localhost:8989"
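
What I would expect the proxy setting to do is route the request to the real Ollama API through the wrapper, roughly like this (my own test sketch assuming an aiohttp-style client, not MetaGPT code):

# Sketch of what honoring the `proxy` option would look like with aiohttp:
# the request targets the real Ollama endpoint, but is sent via the wrapper.
import asyncio
import aiohttp

async def main():
    payload = {
        "model": "llama2",
        "messages": [{"role": "user", "content": "hello"}],
        "stream": False,
    }
    async with aiohttp.ClientSession() as session:
        async with session.post(
            "http://localhost:11434/api/chat",
            json=payload,
            proxy="http://localhost:8989",  # the wrapper acting as proxy
        ) as resp:
            print(resp.status, await resp.text())

asyncio.run(main())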

But the proxy setting doesn't even seem to be picked up. How can I configure Ollama via this wrapper URL?

Environment information: macOS (M2), LLM type: Ollama

yrobla · Feb 26 '25 09:02