
How to pass the history of messages to actions?

Open cmpeburak opened this issue 1 year ago • 8 comments

Hi team,

In the custom action code that we write, we need a complete list of messages from a chat conversation. However, when I examine the three arguments passed to the action (llm_task_manager, context, and llm), none of them contains the full chat history. When I debug the code, I notice that the messages are present in the events dictionary. However, the context object only contains the most recent messages. Should these messages be made accessible within the actions, or should we implement a memory system to store them?

cmpeburak avatar Dec 19 '23 08:12 cmpeburak

Hi @cmpeburak!

If an action needs access to the entire history, adding an `events: List[dict]` parameter is the way to go. Transforming the complete list of events into a sequence of user/assistant messages can be done similarly to: https://github.com/NVIDIA/NeMo-Guardrails/blob/develop/nemoguardrails/llm/filters.py#L90.
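For illustration, here is a minimal sketch of such an action (the action name and logic are hypothetical; in a real config you would also register it, e.g. with the `@action()` decorator from `nemoguardrails.actions`):

```python
from typing import List, Optional

# Hypothetical custom action: by declaring an `events: List[dict]`
# parameter, the runtime passes the full event history into the action.
def check_user_turns(events: Optional[List[dict]] = None) -> int:
    """Return the number of completed user utterances in the history."""
    events = events or []
    # Each finished user utterance appears as an
    # "UtteranceUserActionFinished" event.
    return sum(
        1 for e in events if e.get("type") == "UtteranceUserActionFinished"
    )
```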

I think also exposing a messages parameter directly could be a good addition to the python API for actions.

drazvan avatar Dec 19 '23 21:12 drazvan

Thanks @drazvan, the user_assistant_sequence filter will help us move forward.

Also, +1 for adding messages parameter to the Python API for actions.

cmpeburak avatar Dec 19 '23 22:12 cmpeburak

I'm using a slightly modified version of user_assistant_sequence that returns the messages in the OpenAI input format. Note that an event["type"] == "UserMessage" check does not capture all user messages in the events.

from typing import List

def filter_messages_from_events(events: List[dict]) -> List[dict]:
    """Convert the raw event history into OpenAI-style chat messages."""
    history = []
    for event in events:
        # A completed user utterance carries its text in `final_transcript`.
        if event["type"] == "UtteranceUserActionFinished":
            history.append({"role": "user", "content": event["final_transcript"]})
        # A started bot utterance carries its text in `script`.
        elif event["type"] == "StartUtteranceBotAction":
            history.append({"role": "assistant", "content": event["script"]})
    return history

cmpeburak avatar Dec 19 '23 23:12 cmpeburak
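To show the conversion in action, here is the same function applied to a minimal synthetic event list (the event payloads are illustrative, not taken from a real run):

```python
from typing import List

def filter_messages_from_events(events: List[dict]) -> List[dict]:
    """Convert guardrails events into OpenAI-style chat messages."""
    history = []
    for event in events:
        if event["type"] == "UtteranceUserActionFinished":
            history.append({"role": "user", "content": event["final_transcript"]})
        elif event["type"] == "StartUtteranceBotAction":
            history.append({"role": "assistant", "content": event["script"]})
    return history

events = [
    {"type": "UtteranceUserActionFinished", "final_transcript": "hi"},
    {"type": "StartUtteranceBotAction", "script": "hi, how may I help you?"},
    {"type": "ContextUpdate", "data": {}},  # unrelated event types are skipped
]
print(filter_messages_from_events(events))
# → [{'role': 'user', 'content': 'hi'}, {'role': 'assistant', 'content': 'hi, how may I help you?'}]
```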

One additional question @drazvan:

This is how I generate completion with a list of messages history.

async def _generate_async(prompt=None, messages=None):
    result = await rails.generate_async(prompt=prompt, messages=messages)
    return result


async def main():
    result = await _generate_async(
        messages=[
            {"role": "system", "content": DISCOVERY_CHAT},
            {"role": "user", "content": "hi"},
            {"role": "assistant", "content": "hi, how may I help you?"},
            {"role": "user", "content": "I want to learn machine learning."},
            {"role": "assistant", "content": "here are some popular machine learning topics: deep learning and data science."},
            {"role": "user", "content": "tell me more about data science."},
        ],
    )
    print(result)

When I check the events argument passed to the action, I do not see the system prompt. How can I access it? I need it to build the complete chat history in the action.

cmpeburak avatar Dec 19 '23 23:12 cmpeburak

Currently, the system prompt is not forwarded as part of the chat history. If this is still relevant, I can point you to a workaround using a context variable.

drazvan avatar Jan 24 '24 01:01 drazvan

Currently, the system prompt is not forwarded as part of the chat history. If this is still relevant, I can point you to a workaround using a context variable.

Hi @drazvan , yes please.

cmpeburak avatar Jan 24 '24 09:01 cmpeburak

Currently, the system prompt is not forwarded as part of the chat history. If this is still relevant, I can point you to a workaround using a context variable.

Hi @drazvan, can you help me with that one? We still need to add the system prompt to the chat history somehow. Thank you.

cmpeburak avatar Mar 14 '24 15:03 cmpeburak

@cmpeburak: you can set the role to context instead of system, as shown below:

async def main():
    result = await _generate_async(
        messages=[
            {"role": "context", "content": {"system_prompt": DISCOVERY_CHAT}},
            {"role": "user", "content": "hi"},
            {"role": "assistant", "content": "hi, how may I help you?"},
            {"role": "user", "content": "I want to learn machine learning."},
            {"role": "assistant", "content": "here are some popular machine learning topics: deep learning and data science."},
            {"role": "user", "content": "tell me more about data science."},
        ],
    )
    print(result)

This will set a context variable that you can include in your prompt template using {{ system_prompt }}.
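For example, a prompt override in config/prompts.yml could reference the variable like this (a sketch; the exact task name and template depend on your configuration):

```yaml
# config/prompts.yml (sketch): include the `system_prompt` context
# variable at the top of the prompt for the general generation task.
prompts:
  - task: general
    content: |
      {{ system_prompt }}

      {{ history | user_assistant_sequence }}
      Assistant:
```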

drazvan avatar Mar 20 '24 21:03 drazvan