SongChiyoung
```python
client1 = OllamaChatCompletionClient(
    model="llama3.1:latest",
    host="127.0.0.1:11435",
    api_key="ollama",
)
client2 = OpenAIChatCompletionClient(
    model="llama3.1:latest",
    base_url="http://127.0.0.1:11435/v1",
    api_key="ollama",
    model_info={
        "vision": False,
        "function_calling": True,
        "json_output": True,
        "family": "unknown",
    },
)
messages = [
    UserMessage(content="hello", source="user"),
    ...
```
If this resolves your whole issue, please close it, since it is not a bug.
Do other function calls (tools) work without MCP?
```python
import asyncio

from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.conditions import TextMentionTermination
from autogen_core import CancellationToken
from autogen_ext.tools.mcp import StdioServerParams, mcp_server_tools
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.models.ollama...
```
@xsw1006931693 Hi, I was wondering... could this issue possibly be related to #6198? I couldn’t dig deeper into that one due to the lack of a reproducible code snippet, but...
@ekzhu Thank you for information. You're absolutely right — this issue can definitely be addressed via the modular transformer system in #6063. To keep things more maintainable though, how about...
Okay, I'll resolve it. I think I know what the problem is.
@sasan-hashemi I see, haha. I know what this issue is and have resolved it.
Hi everyone, I’d appreciate your help on this issue — please consider filling out the survey above! If you have any confirmed findings or model-specific behavior to share, feel free...
> Let's make an optional field and by default set it to `False`? Yes, that's what I want.
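To illustrate the idea of an opt-in flag, here is a minimal sketch using a plain dataclass — the class name and field name are hypothetical stand-ins, not the actual autogen config:

```python
from dataclasses import dataclass


@dataclass
class TransformerConfig:
    # Hypothetical flag: optional and False by default, so existing
    # callers keep the current behavior unless they explicitly opt in.
    strict_mode: bool = False


default_cfg = TransformerConfig()
opt_in_cfg = TransformerConfig(strict_mode=True)
print(default_cfg.strict_mode)  # False unless the caller opts in
print(opt_in_cfg.strict_mode)   # True only when explicitly enabled
```

Defaulting to `False` keeps the change backward compatible: no existing configuration has to be updated.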