
Dealing with output from assistant agent

sridhar21111976 opened this issue 2 years ago · 17 comments

Hi Team,

Is there a way to hide the exchanges between the assistant agent and the user proxy agent and get only the required final output from the assistant agent, once the user proxy agent has terminated the conversation?

I am trying to use this to solve complex multi-step scenarios, but I am only interested in the final answer, unless human input is required for a given step.

sridhar21111976 avatar Oct 09 '23 08:10 sridhar21111976

There is no such fine-grained printing mechanism yet. However, one workaround is to log all the history and retrieve the conversations that satisfy certain conditions by post-processing the logged history. See the code examples about logging below:

Enable logging: https://github.com/microsoft/autogen/blob/main/test/agentchat/test_assistant_agent.py#L122

Check the logged info: https://github.com/microsoft/autogen/blob/main/test/agentchat/test_assistant_agent.py#L150
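A rough sketch of that logging approach, based on the linked tests (the logging API, config path, and agent setup here are assumptions and may vary by version):

```python
import autogen

# Sketch only: config path and agents are placeholders taken from typical examples.
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = autogen.UserProxyAgent(
    "user_proxy", human_input_mode="NEVER", code_execution_config=False
)

autogen.ChatCompletion.start_logging()            # start recording ChatCompletion calls
user_proxy.initiate_chat(assistant, message="Solve the multi-step task ...")
history = autogen.ChatCompletion.logged_history   # dict of logged requests/responses
autogen.ChatCompletion.stop_logging()

# Post-process `history` to keep only the final answer you care about.
```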

Please let me know if this does not address your needs.

qingyun-wu avatar Oct 10 '23 01:10 qingyun-wu

Hi Qingyun-wu

The idea was to avoid all the intermediate outputs, so logging and parsing through the entire content becomes complex. What might be useful is something like a verbose switch that can be turned off, so that only the final output after the TERMINATE exchange is returned.

Also, I am seeing that the agent often does not recognise an existing function and says the function does not exist, even though it identifies the right function needed. This is intermittent; probably a cache issue, but I am not sure.

Also, is there an option to flush the cache? It would be nice to understand what level of information is cached.

sridhar21111976 avatar Oct 10 '23 22:10 sridhar21111976

In https://microsoft.github.io/autogen/docs/reference/agentchat/conversable_agent#initiate_chat, set silent=True to skip printing. Then get the chat messages or the last message.

Clear cache: https://microsoft.github.io/autogen/docs/reference/oai/completion#clear_cache. The cache is made per ChatCompletion.create request.
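For example, a minimal sketch of those two suggestions (the agents and config here are placeholders, not from this thread):

```python
import autogen

# Sketch only: agent setup is a placeholder.
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
user_proxy = autogen.UserProxyAgent(
    "user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
    is_termination_msg=lambda m: (m.get("content") or "").rstrip().endswith("TERMINATE"),
)

# silent=True suppresses the intermediate console output of the exchange
user_proxy.initiate_chat(assistant, message="Answer the question and reply TERMINATE.", silent=True)

# Afterwards, read only the final message exchanged with the assistant
print(user_proxy.last_message(assistant)["content"])

# Clear the completion cache if stale cached results are suspected
autogen.Completion.clear_cache()
```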

sonichi avatar Oct 10 '23 22:10 sonichi

Thank you Sonichi, will give that a try. Any sample code is appreciated. This is great stuff, team. I have been doing an LLM-to-LLM conversation to achieve this so far; this is making it much simpler. A bit more stability on function recognition and respecting the description text is needed; it sometimes ignores the text in the definition.

sridhar21111976 avatar Oct 11 '23 04:10 sridhar21111976

Some of the answers are worth adding to the documentation website.

sonichi avatar Oct 22 '23 16:10 sonichi

Hi Sonichi,

I tried the last message option. Given the last interaction is a TERMINATE command to end the conversation, the last message printed is TERMINATE. I have worked around this by asking the agent, through the prompt, to format the final answer as something like {answer} TERMINATE, and then trimming the TERMINATE word from the final answer. Is there a more elegant way to do this?
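Roughly, the trimming workaround looks like this (a sketch; `user_proxy` is whatever agent received the final reply):

```python
# Sketch of the workaround: strip the TERMINATE keyword from the final reply.
final_message = user_proxy.last_message()["content"]     # e.g. "{answer} TERMINATE"
answer = final_message.replace("TERMINATE", "").strip()
print(answer)
```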

Also, a question around memory: what is the default chat history / memory length, and is there any option to control or reset it? I can see a chat_history parameter (Boolean) - is this the only option?

Also, if I build an application using AutoGen, is it expected to remain and be supported long term?

sridhar21111976 avatar Oct 25 '23 03:10 sridhar21111976

The chat history keeps growing in memory until you clear it: https://microsoft.github.io/autogen/docs/reference/agentchat/conversable_agent#clear_history
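For example (a sketch; the agent names are placeholders):

```python
# Reset stored chat history when it grows too large.
assistant.clear_history()            # clear history with all agents
assistant.clear_history(user_proxy)  # or only the history with one specific agent
```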

No one can promise forever, but AutoGen has been known to the public for only one month and already has a big, vibrant community. It's not going to die anytime soon.

sonichi avatar Oct 28 '23 15:10 sonichi

> I tried the last message option. Given the last interaction is a TERMINATE command to end the conversation, the last message printed is TERMINATE. ... Is there a more elegant way to do this?

I still haven't found a way to get no output at all from initiate_chat(), but it seems individual messages from the assistants can be retrieved with `print(list(assistant._oai_messages.values())[0][-3]['content'])`. For me it was -3 because -1 was 'TERMINATE' and -2 was blank, but it could vary depending on the output you get.
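A slightly more defensive variant of the same idea (a sketch; it just walks backwards past blank and TERMINATE-only messages instead of hard-coding the index):

```python
# Sketch: take the last substantive message from the assistant's recorded conversation,
# skipping blank messages and the bare TERMINATE turn.
messages = list(assistant._oai_messages.values())[0]
final = next(
    (
        m["content"]
        for m in reversed(messages)
        if (m.get("content") or "").strip() not in ("", "TERMINATE")
    ),
    None,
)
print(final)
```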

karlo-franic avatar Dec 07 '23 13:12 karlo-franic

I am trying to get the last message. For context, I have called something like:

user_proxy.initiate_chat(group_chat_manager, message="This is my message")

This works and I get the full conversation. I'm trying to use the last_message() function, but when I do something like:

group_chat_manager.last_message()

It says group_chat_manager was not part of any recent conversations. I tried calling last_message() with other agents that were part of the group chat and I get the same error. Can someone please provide an actual usage example of the function? Linking to the documentation is not clear enough.

ksivakumar avatar Feb 22 '24 03:02 ksivakumar

silent=True only works for the first message; after that it prints the messages again.

objecthuman avatar Apr 16 '24 11:04 objecthuman

> silent=True only works for the first message; after that it prints the messages again.

That sounds like a bug. Help is appreciated!

cc @cheng-tan @Hk669 @giorgossideris @krishnashed @WaelKarkoub

sonichi avatar Apr 16 '24 14:04 sonichi

> I'm trying to use the last_message() function, but ... it says group_chat_manager was not part of any recent conversations. Can someone please provide an actual usage example of the function?

Could you try the new "ChatResult" returned from the chat? https://microsoft.github.io/autogen/docs/tutorial/conversation-patterns#group-chat
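For example, something along these lines (a sketch; `user_proxy` and `group_chat_manager` are the objects from the snippet above):

```python
# Sketch: use the ChatResult object returned by initiate_chat instead of last_message().
chat_result = user_proxy.initiate_chat(group_chat_manager, message="This is my message")

print(chat_result.summary)           # summary of the chat (depends on summary_method)
print(chat_result.chat_history[-1])  # the final message of the conversation
```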

sonichi avatar Apr 16 '24 14:04 sonichi

@Neeraj319 would you be able to open up a new issue? And if possible, include a minimal example to reproduce it.

WaelKarkoub avatar Apr 16 '24 14:04 WaelKarkoub

> That sounds like a bug. Help is appreciated!

Sure @sonichi, let me check it out!

krishnashed avatar Apr 16 '24 14:04 krishnashed

I have opened an issue: #2402

objecthuman avatar Apr 17 '24 02:04 objecthuman

Not sure if this can help, but if you want to completely turn off the output, you can create a silent console, something like this:

```python
from typing import Any

# Imports added for completeness, assuming the autogen.io module layout
# (IOConsole in autogen.io.console, IOStream in autogen.io.base).
from autogen.io import base, console


class SilentConsole(console.IOConsole):
    def print(self, *objects: Any, sep: str = " ", end: str = "\n", flush: bool = False) -> None:
        # Swallow all console output.
        pass
```

and set it as the default:

```python
base.IOStream.set_global_default(SilentConsole())
base.IOStream.set_default(SilentConsole())
```

yonitjio avatar May 24 '24 06:05 yonitjio

It would be a useful feature if we could set the verbosity level for each agent in a group chat. As you can imagine, some parts of the conversation are not relevant, useful, or interesting to the end user, so we might not want to show them. As the conversation gets longer, the amount of content shown in a UI also grows, so being able to hide it would improve the end-user experience.

teyang-lau avatar Jul 29 '24 08:07 teyang-lau

There are several proposed solutions in this issue, and other relevant issues have been opened. Marking as won't-fix for 0.2.

rysweet avatar Oct 12 '24 02:10 rysweet

`generate_reply()` will produce the last message.
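For example (a sketch; the message list is a placeholder):

```python
# Sketch: ask the assistant for a single reply without running a whole chat loop.
reply = assistant.generate_reply(
    messages=[{"role": "user", "content": "Give me only the final answer."}]
)
print(reply)
```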

jacobodetunde avatar Dec 11 '24 17:12 jacobodetunde