open_deep_research
[Feature request] Support configurable chat_model
It would be more versatile if chat_model could be set via the configuration, so that model-specific parameters (such as extended thinking in Claude 3.7 Sonnet or reasoning_effort in o3-mini) could be changed freely. What do you think?
For example:
```python
import uuid

from langchain_anthropic import ChatAnthropic
from langchain_openai import ChatOpenAI

thread = {
    "configurable": {
        "thread_id": str(uuid.uuid4()),
        "search_api": "tavily",
        # Proposed new keys: pass pre-built chat models into the config
        "planner_chat_model": ChatOpenAI(model="o3-mini"),
        "writer_chat_model": ChatAnthropic(
            model="claude-3-7-sonnet-latest",
            thinking={"type": "enabled", "budget_tokens": 16_000},
        ),
        "report_structure": REPORT_STRUCTURE,
        "max_search_depth": 1,
    }
}
```
The default behavior would stay as it is now; I think it would be good to offer this as an advanced setting.
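To illustrate the fallback idea, here is a minimal sketch of how a node could read an optional chat model from the config and otherwise keep today's behavior. The key name "planner_chat_model" and the default placeholder are assumptions for this proposal, not the project's actual API; in practice the configured value would be a chat-model instance rather than a string.

```python
def get_planner_model(config: dict, default: str = "current-default-model"):
    """Return the configured planner chat model, or the default if unset.

    "planner_chat_model" is the hypothetical config key proposed above;
    "current-default-model" stands in for whatever the project uses today.
    """
    return config.get("configurable", {}).get("planner_chat_model", default)


thread = {"configurable": {"planner_chat_model": "o3-mini"}}
print(get_planner_model(thread))  # the user-supplied model
print(get_planner_model({}))      # falls back to the default
```

This keeps the change backward compatible: existing configs without the new key behave exactly as before.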