
Add `ChatMessage` placeholder on `ChatPromptBuilder`

Open · CarlosFerLo opened this issue 1 year ago · 1 comment

Is your feature request related to a problem? Please describe.
I was working on issue #7868, exploring alternative ways of storing messages: one based on a ChatMessageStack, a simple stack that keeps messages in order and returns the last N of them for short-term memory, and another based on the Indexable alternative proposed in #7830 for long-term memory. Once I had worked out everything I needed memory-wise, I tried to group everything in the following manner:

- System message introducing the agent objective and the long-term memory messages.
- <The messages retrieved from the long-term memory>
- System message introducing the last N messages.
- <The messages retrieved from the short-term memory>
- System message containing the context for RAG and introducing the last query.
- Chat message containing the user query.

To build this, however, I had to create another component whose only job is to stack chat messages so that the prompt takes advantage of how chat models are trained. I think this should be supported directly by the ChatPromptBuilder component, as interleaving messages like this is good practice when prompting chat models; a rough sketch of such a stop-gap component is shown below.
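For reference, here is a minimal sketch of what that stop-gap message-stacking component could look like, assuming the Haystack 2.x @component API. The name ChatMessageJoiner and its prefix/history/suffix inputs are hypothetical and not part of Haystack:

from typing import List

from haystack import component
from haystack.dataclasses import ChatMessage


@component
class ChatMessageJoiner:  # hypothetical name, not an existing Haystack component
    """Stacks fixed framing messages around a variable-length chat history."""

    @component.output_types(messages=List[ChatMessage])
    def run(self, prefix: List[ChatMessage], history: List[ChatMessage], suffix: List[ChatMessage]):
        # Concatenate the three lists in order: system framing, retrieved
        # history, then the RAG context and user query messages.
        return {"messages": prefix + history + suffix}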

Describe the solution you'd like
I would like some kind of placeholder in the ChatPromptBuilder template that lets chat messages be inserted directly into it, as shown here:

from haystack import Document
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

template = [
    ChatMessage.from_system("You are a helpful assistant. This is your chat history:"),
    PlaceHolder("messages"),  # proposed placeholder, not part of Haystack today
    ChatMessage.from_system("""Respond to the following query based on the following context.
    Context:
    {% for doc in documents %}
        {{ doc.content }}
    {% endfor %}
    """),
    ChatMessage.from_user("{{ query }}")
]

prompt_builder = ChatPromptBuilder(template=template)

messages = [  # list of chat history messages
    ChatMessage.from_user("First query"),
    ChatMessage.from_assistant("Your response")
]
documents = [  # list of retrieved documents
    Document(content="Document Content")
]
result = prompt_builder.run(messages=messages, documents=documents, query="This is the last query")

This should produce the following list under the prompt key of result:

[
    ChatMessage.from_system("You are a helpful assistant. This is your chat history:"),
    ChatMessage.from_user("First query"),
    ChatMessage.from_assistant("Your response"),
    ChatMessage.from_system("""Respond to the following query based on the following context.
    Context:
       Document Content
    """),
    ChatMessage.from_user("This is the last query")
]
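Until something like PlaceHolder exists, one way to approximate this today (a sketch assuming the current Haystack 2.x ChatPromptBuilder API, where run returns the rendered messages under the prompt key) is to render only the RAG part with the builder and splice the history in with plain Python:

from haystack import Document
from haystack.components.builders import ChatPromptBuilder
from haystack.dataclasses import ChatMessage

# Template for the RAG part only; the chat history is added afterwards.
rag_template = [
    ChatMessage.from_system("""Respond to the following query based on the following context.
    Context:
    {% for doc in documents %}
        {{ doc.content }}
    {% endfor %}
    """),
    ChatMessage.from_user("{{ query }}"),
]
builder = ChatPromptBuilder(template=rag_template)

history = [
    ChatMessage.from_user("First query"),
    ChatMessage.from_assistant("Your response"),
]
documents = [Document(content="Document Content")]

# Render the RAG messages, then stack system framing + history + RAG part.
rag_messages = builder.run(documents=documents, query="This is the last query")["prompt"]
prompt = (
    [ChatMessage.from_system("You are a helpful assistant. This is your chat history:")]
    + history
    + rag_messages
)

This works, but it pushes the message stacking out of the pipeline and into custom code, which is exactly what the proposed placeholder would avoid.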

Describe alternatives you've considered
I considered writing my own component, but I believe this fits better inside the ChatPromptBuilder than in a new component.

CarlosFerLo · Jun 30 '24 10:06

Hello @CarlosFerLo, we merged a new cookbook today that shows how to use our experimental InMemoryChatMessageStore with a retriever and a writer component (https://github.com/deepset-ai/haystack-cookbook/pull/108). As part of the cookbook, we're also using ChatMessages as input to a ChatPromptBuilder. Does that solve the issue you describe here?

julian-risch · Aug 19 '24 12:08