Bug: LLMConfig with extra="forbid" does not allow passing LLM-specific options.
The original issue was reported on Discord:
Hi Team 👋,
I was exploring the websocket streaming example and wanted to stream responses similarly.
It worked fine in AG2 v0.8.1, but now fails in v0.8.7.
✅ Works in v0.8.1
```python
router_agent = ConversableAgent(
    llm_config={
        "config_list": config_list,
        "stream": True,
    },
    name="router_agent",
    ...
)
```
❌ Fails in v0.8.7
```python
ConversableAgent(
    name=WELCOME_AGENT_NAME,
    system_message=greeting_prompt,
    llm_config={"config_list": config_list, "stream": True},
)
```
Error:
```
1 validation error for _LLMConfig
stream
  Extra inputs are not permitted
```
🔍 Diff observed in llm_config.py, line 84:
v0.8.1:
```python
model_config = ConfigDict(extra="allow")
```
v0.8.7:
```python
model_config = ConfigDict(extra="forbid")
```
Changing extra back to "allow" in v0.8.7 makes streaming work again.
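For context, here is a minimal Pydantic v2 sketch reproducing the difference between the two settings (ForbidConfig and AllowConfig are hypothetical stand-ins, not AG2's actual classes):

```python
from pydantic import BaseModel, ConfigDict, ValidationError

class ForbidConfig(BaseModel):
    # Mirrors the v0.8.7 behaviour: unknown keys are rejected.
    model_config = ConfigDict(extra="forbid")
    config_list: list = []

class AllowConfig(BaseModel):
    # Mirrors the v0.8.1 behaviour: unknown keys are kept as extra attributes.
    model_config = ConfigDict(extra="allow")
    config_list: list = []

try:
    ForbidConfig(config_list=[], stream=True)
except ValidationError as err:
    print(err)  # reports "Extra inputs are not permitted" for the "stream" key

allowed = AllowConfig(config_list=[], stream=True)
print(allowed.stream)  # True; the extra key survives validation
```

This is why any LLM-specific option that is not an explicit field on _LLMConfig now fails validation instead of being passed through to the client.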
❓ Questions
- Was there a reason for switching to extra="forbid"?
- Is "stream": True deprecated in newer versions?
- What's the correct way to stream in v0.9+?
- If streaming isn't viable, what's the best way to emit output from a Swarm agent: through events or by building up context?
Thanks for the support! 🙏
Does anyone have a chance to fix this one? I'm specifically looking for token-level streaming so responses can render progressively in the UI (like how OpenAI's API supports stream=True). Curious whether that's on the roadmap or already possible.
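For reference, this is the kind of token-level streaming being asked for; a minimal sketch using the openai v1 Python client directly (the model name and prompt are placeholders, and OPENAI_API_KEY is assumed to be set in the environment):

```python
from openai import OpenAI

client = OpenAI()
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)  # render each token as it arrives
```

The goal is to get the same progressive output through an AG2 ConversableAgent's llm_config.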
The problem seems to have been introduced by this PR: https://github.com/ag2ai/ag2/pull/1355
@kumaranvpl, can you please explain the changes you made in that PR?