
Discussion about nested_chats design

Open lfygh opened this issue 1 year ago • 1 comments

Discussed in https://github.com/microsoft/autogen/discussions/3292

Originally posted by lfygh on August 5, 2024: The documentation describes nested chat as a powerful conversation pattern that lets you package a complex workflow into a single agent.

I'm a bit confused about the nested chat design. Nested chat is implemented by registering the reply_func_from_nested_chats reply function. Inside it, only last_msg (the content of the last message) is used to initialize the new chat, so the rest of the chat history is lost. That seems inconsistent with the design of generate_reply, which accepts a full message list.

def _summary_from_nested_chats(
    chat_queue: List[Dict[str, Any]], recipient: Agent, messages: Union[str, Callable], sender: Agent, config: Any
) -> Tuple[bool, str]:
    last_msg = messages[-1].get("content")
    chat_to_run = []
    for i, c in enumerate(chat_queue):
        current_c = c.copy()
        if current_c.get("sender") is None:
            current_c["sender"] = recipient
        message = current_c.get("message")
        # If message is not provided in chat_queue, we by default use the last message from the original chat history as the first message in this nested chat (for the first chat in the chat queue).
        # NOTE: This setting is prone to change.
        if message is None and i == 0:
            message = last_msg
        if callable(message):
            message = message(recipient, messages, sender, config)
        # We only run chat that has a valid message. NOTE: This is prone to change depending on applications.
        if message:
            current_c["message"] = message
            chat_to_run.append(current_c)
    if not chat_to_run:
        return True, None
    res = initiate_chats(chat_to_run)
    return True, res[-1].summary

If I want to inject some context before the conversation starts, it doesn't work:

import os

from autogen import ConversableAgent

llm_config = {
    "config_list": [{"model": "gpt-4o-mini", "api_key": os.environ["OPENAI_API_KEY"]}]
}

agent = ConversableAgent(
    name="Agent", llm_config=llm_config, human_input_mode="NEVER",
)

agent1 = ConversableAgent(
    name="Agent1", llm_config=llm_config, human_input_mode="NEVER",
)

# Trigger the nested chat for any sender other than the nested recipient.
agent.register_nested_chats(
    [{"recipient": agent1, "max_turns": 1}],
    trigger=lambda sender: sender is not agent1,
)

r = agent.generate_reply([
    {"role": "user", "content": "hi, what's your name?"},
    {"role": "system", "content": "my name is agent"},
    {"role": "user", "content": "what's your name?"},
])
print(r)
********************************************************************************
Agent (to Agent1):

what's your name?

--------------------------------------------------------------------------------
Agent1 (to Agent):

I'm called Assistant! How can I help you today?

Shouldn't _summary_from_nested_chats handle the full message list? Or am I missing something?
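For what it's worth, the callable message hook visible in _summary_from_nested_chats above suggests a workaround sketch: instead of the plain registration used in the example, pass a function as the first chat's message and fold the whole outer history into it yourself. The helper name carry_full_history below is hypothetical; only the (recipient, messages, sender, config) signature comes from the code above.

# Workaround sketch: supply a callable "message" so the nested chat starts
# from the whole outer history instead of only the last turn.
def carry_full_history(recipient, messages, sender, config):
    # Fold every outer message into a single string used as the first nested message.
    return "\n".join(
        f"{m.get('role', 'user')}: {m.get('content', '')}" for m in messages
    )

agent.register_nested_chats(
    [{"recipient": agent1, "message": carry_full_history, "max_turns": 1}],
    trigger=lambda sender: sender is not agent1,
)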

lfygh avatar Aug 07 '24 01:08 lfygh

Strongly agree! I have opened a new issue about this: #issue

smallQQ0227 avatar Nov 11 '24 09:11 smallQQ0227

Latest update from the v0.4 preview: SocietyOfMindAgent is a built-in approach to nested chat.

https://microsoft.github.io/autogen/dev/reference/python/autogen_agentchat.agents.html#autogen_agentchat.agents.SocietyOfMindAgent

You can refer to its implementation and implement your own nested-chat agent as a custom agent:

https://microsoft.github.io/autogen/dev/user-guide/agentchat-user-guide/tutorial/custom-agents.html
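A minimal sketch of that approach, assuming the import paths from the stable 0.4 documentation (they shifted between the dev previews, so treat the exact modules and parameters as assumptions):

import asyncio

from autogen_agentchat.agents import AssistantAgent, SocietyOfMindAgent
from autogen_agentchat.conditions import MaxMessageTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")

    # Inner team: the "nested chat" packaged behind a single agent.
    writer = AssistantAgent("writer", model_client=model_client)
    critic = AssistantAgent("critic", model_client=model_client)
    inner_team = RoundRobinGroupChat(
        [writer, critic], termination_condition=MaxMessageTermination(4)
    )

    # SocietyOfMindAgent runs the inner team and replies to the outer
    # conversation with a single message, so the outer history stays intact.
    society = SocietyOfMindAgent("society", team=inner_team, model_client=model_client)

    result = await society.run(task="hi, what's your name?")
    print(result.messages[-1].content)


asyncio.run(main())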

SocietyOfMindAgent will be released in the upcoming dev12 release.

ekzhu avatar Dec 15 '24 06:12 ekzhu