
Bug: LLMConfig.extra forbidden does not allow passing LLM-specific options.

Open harishmohanraj opened this issue 7 months ago • 3 comments

Original issue was reported on Discord:


Hi Team 👋,

I was exploring the websocket streaming example and wanted to stream responses similarly.

It worked fine in AG2 v0.8.1, but now fails in v0.8.7.


✅ Works in v0.8.1

router_agent = ConversableAgent(
    llm_config={
        "config_list": config_list,
        "stream": True,
    },
    name="router_agent",
    ...
)

❌ Fails in v0.8.7

ConversableAgent(
    name=WELCOME_AGENT_NAME,
    system_message=greeting_prompt,
    llm_config={"config_list": config_list, "stream": True},
)

Error:

1 validation error for _LLMConfig
stream
  Extra inputs are not permitted


🔍 Diff observed in llm_config.py, line 84:

v0.8.1:

model_config = ConfigDict(extra="allow")

v0.8.7:

model_config = ConfigDict(extra="forbid")

Changing extra back to "allow" in v0.8.7 makes streaming work again.
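The difference between the two versions comes down to pydantic's `extra` model setting. A minimal stdlib sketch (pydantic itself is not imported here; `validate_config` and `KNOWN_FIELDS` are hypothetical stand-ins, not AG2 code) illustrates why `extra="forbid"` rejects the top-level `stream` key while `extra="allow"` keeps it:

```python
# Hypothetical stand-in for pydantic's `extra` handling; illustrative only.
KNOWN_FIELDS = {"config_list", "temperature", "timeout"}

def validate_config(options: dict, extra: str = "forbid") -> dict:
    """Mimic pydantic's extra="forbid"/"allow" behavior on a plain dict."""
    unknown = set(options) - KNOWN_FIELDS
    if extra == "forbid" and unknown:
        # pydantic raises ValidationError: "Extra inputs are not permitted"
        raise ValueError(f"Extra inputs are not permitted: {sorted(unknown)}")
    return dict(options)  # "allow" keeps unknown keys as-is

cfg = {"config_list": [], "stream": True}
validate_config(cfg, extra="allow")   # accepted, "stream" retained
try:
    validate_config(cfg, extra="forbid")
except ValueError as e:
    print(e)  # Extra inputs are not permitted: ['stream']
```

This is only a behavioral sketch: the real check happens inside pydantic when `_LLMConfig` is constructed, so flipping `ConfigDict(extra=...)` changes which keys survive validation.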


❓ Questions

- Was there a reason for switching to `extra="forbid"`?
- Is `"stream": True` deprecated in newer versions?
- What's the correct way to stream in v0.9+?
- If streaming isn't viable, what's the best way to emit output from a Swarm agent: through events, or by building up context?

Thanks for the support! 🙏

harishmohanraj avatar May 06 '25 15:05 harishmohanraj

Anyone have a chance to fix this one? Specifically looking for token-level streaming so responses can render progressively in the UI (like how OpenAI's API supports stream=True). Curious if that's on the roadmap or already possible.
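Independent of the config bug, the progressive-rendering part is just incremental consumption of chunks. A generic sketch, with a simulated token source standing in for a real streaming client (`fake_token_stream` is hypothetical, not an AG2 or OpenAI API):

```python
from typing import Iterator

def fake_token_stream(text: str) -> Iterator[str]:
    """Simulated token stream; a real client would yield API chunks instead."""
    for word in text.split():
        yield word + " "

def render_progressively(chunks: Iterator[str]) -> str:
    """Accumulate chunks as they arrive, the way a UI appends them on screen."""
    buffer = []
    for chunk in chunks:
        buffer.append(chunk)  # a UI would flush `chunk` to the screen here
    return "".join(buffer)

full_reply = render_progressively(fake_token_stream("streamed reply text"))
```

Whatever AG2 ends up exposing (websocket events, callbacks, or a restored `stream` option), the UI side reduces to a consumer loop of this shape.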

elCaptnCode avatar Jul 31 '25 02:07 elCaptnCode

Seems like the problem in this PR: https://github.com/ag2ai/ag2/pull/1355

Lancetnik avatar Jul 31 '25 20:07 Lancetnik

@kumaranvpl can you please explain the changes you made in the PR?

Lancetnik avatar Aug 06 '25 20:08 Lancetnik