
mcp_basic_ollama_agent example isn't running because of invalid API calls?

Open builder33807664 opened this issue 9 months ago • 4 comments

Hi, I am surprised by the implementation of the Ollama example, because the class and configuration for 'OpenAIAugmentedLLM' are used here ("[...] This implementation uses OpenAI's ChatCompletion as the LLM"). As a result, the wrong API endpoints are called, such as http://localhost:11434/chat/completions, which does not match the Ollama API (see https://github.com/ollama/ollama/blob/main/docs/api.md).

How do I get the example running with my local Ollama model (llama3.2:1b)?

Regards, Martin

builder33807664 avatar Mar 14 '25 13:03 builder33807664

openai:
  base_url: "http://localhost:11434/v1"

Please check the base_url in mcp_basic_ollama_agent\mcp_agent.config.yaml; the value in the example seems correct.
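
For completeness, a sketch of what that section of mcp_agent.config.yaml looks like when pointing at a local Ollama server. The base_url is the one quoted above; the "ollama" api_key value is an assumption (Ollama ignores the key, but as noted later in this thread the client needs one to be set):

```yaml
# Sketch: openai section of mcp_agent.config.yaml for a local Ollama server.
openai:
  base_url: "http://localhost:11434/v1"  # Ollama's OpenAI-compatible endpoint
  api_key: "ollama"                      # placeholder; Ollama ignores it, but the client requires a value
```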

lij55 avatar Mar 19 '25 13:03 lij55

@builder33807664 can you confirm whether you were able to get this working? For some context, the Ollama docs suggest using http://localhost:11434/v1 as the base_url. See https://github.com/ollama/ollama/blob/main/docs/openai.md.

I did notice there is a small issue when no api key is specified, so I've just pushed a fix to the example to specify an api key. Please let me know if that was the issue you were encountering. I've confirmed the example works.

Make sure you ran ollama run llama3.2:1b before starting the example, since the Ollama server needs to be running.
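
To illustrate why the /v1 suffix in base_url matters: the OpenAI-compatible client appends its route to base_url, so without /v1 the request ends up at http://localhost:11434/chat/completions, which is not part of Ollama's OpenAI-compatible surface. A minimal sketch (the helper below is hypothetical, not mcp-agent code; it just mimics how the endpoint URL is derived):

```python
# Sketch: how an OpenAI-compatible client derives the chat endpoint
# from base_url. Illustrative only, not taken from mcp-agent.
def chat_endpoint(base_url: str) -> str:
    """Append the chat-completions route the way the OpenAI client does."""
    return base_url.rstrip("/") + "/chat/completions"

# With /v1, the request hits Ollama's OpenAI-compatible API:
print(chat_endpoint("http://localhost:11434/v1"))
# -> http://localhost:11434/v1/chat/completions

# Without it, the request misses the compatibility layer:
print(chat_endpoint("http://localhost:11434"))
# -> http://localhost:11434/chat/completions
```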

saqadri avatar Mar 25 '25 03:03 saqadri

I got this error:

[DEBUG] 2025-03-27T11:04:05 mcp_agent.workflows.llm.augmented_llm_openai.finder - {'model': 'llama3.2:3b', 'messages': [{'role': 'system', 'content': "You are an agent with access to the filesystem, \n ......
[DEBUG] 2025-03-27T11:04:05 mcp_agent.workflows.llm.augmented_llm_openai.finder - Chat in progress { "data": { "progress_action": "Chatting", "model": "llama3.2:3b", "agent_name": "finder", "chat_turn": 3 } }
[mcp_agent.workflows.llm.augmented_llm_openai.finder] Error: DNS lookup failed

Do you know why? @saqadri

And I can get a response from a direct API call using Bruno (a Postman-like API tool):

[Image attached]

xianshenglu avatar Mar 27 '25 03:03 xianshenglu


@saqadri it works! Thanks a lot for your support.

builder33807664 avatar Mar 29 '25 11:03 builder33807664