
add add_system_message method to ChatMessageHistory

[Open] mziru opened this issue 1 year ago • 2 comments

I added an add_system_message method to ChatMessageHistory (the default implementation of BaseChatMessageHistory). Should pass linting this time!

Seems useful, e.g., to initialize a conversational agent with a system message already in memory.

Is there a reason I'm not thinking of that this wasn't already implemented, i.e., to go along with add_user_message and add_ai_message? (Even if not all models support system messages.)

If not, should this be added as an abstract method to the base class and implemented elsewhere, e.g. for CosmosDB and DynamoDB?
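For illustration, here is a minimal sketch of what the proposed method could look like. These are simplified stand-ins, not the real LangChain classes (the actual message and history types live in langchain.schema); the point is just that add_system_message would mirror the existing add_user_message/add_ai_message pattern:

```python
from dataclasses import dataclass, field
from typing import List

# Simplified stand-ins for LangChain's message classes (illustrative only).
@dataclass
class BaseMessage:
    content: str

class HumanMessage(BaseMessage): pass
class AIMessage(BaseMessage): pass
class SystemMessage(BaseMessage): pass

@dataclass
class ChatMessageHistory:
    messages: List[BaseMessage] = field(default_factory=list)

    def add_user_message(self, message: str) -> None:
        self.messages.append(HumanMessage(content=message))

    def add_ai_message(self, message: str) -> None:
        self.messages.append(AIMessage(content=message))

    # The proposed addition: same shape as the two methods above.
    def add_system_message(self, message: str) -> None:
        self.messages.append(SystemMessage(content=message))
```

With this, a history could be seeded before handing it to an agent: `history.add_system_message("You are a helpful assistant.")`.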

mziru avatar Apr 29 '23 21:04 mziru

Completely agree it can be useful to have a system message in the prompt, but that's more the role of the prompt itself than of the chat history.
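The distinction can be sketched as follows: instead of persisting the system message in the history, the prompt layer prepends it at prompt-construction time, so it is re-applied on every call and never stored. This uses plain OpenAI-style role/content dicts rather than LangChain's prompt classes, purely as an assumption-light illustration:

```python
from typing import Dict, List

Message = Dict[str, str]  # OpenAI-style {"role": ..., "content": ...} dicts

def build_prompt_messages(system_prompt: str, history: List[Message]) -> List[Message]:
    # The system message belongs to the prompt: it is prepended on each
    # call, while the stored history stays free of system messages.
    return [{"role": "system", "content": system_prompt}] + history

history = [
    {"role": "user", "content": "What's the capital of France?"},
    {"role": "assistant", "content": "Paris."},
]
messages = build_prompt_messages("You are a terse geography tutor.", history)
```

The history list itself is never mutated, which is the crux of hwchase17's point.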

hwchase17 avatar May 01 '23 23:05 hwchase17

Completely agree it can be useful to have a system message in the prompt, but that's more the role of the prompt itself than of the chat history.

Yes, that definitely makes sense. I was running into validation errors specifically around using system messages with the chat-conversational-react-description agent, which expects a memory component. I was able to resolve these by adding the system message to the memory before instantiating the conversational agent, but I'll take a closer look to see if I can resolve them with the prompt itself rather than messing with the chat history (and I'll follow up if not).

Thanks for the feedback.

mziru avatar May 05 '23 19:05 mziru

Sounds like this isn't exactly the solution we want; closing for now, but let me know if I'm missing something!

baskaryan avatar Aug 11 '23 21:08 baskaryan

I have found it useful to append a system message to the end of the chat history for gpt-3.5-turbo, which even after OpenAI's upgrade still sometimes has trouble paying attention to the system message. For example, I have this line after I call save_context on the ConversationBufferWindowMemory:

```python
memory.chat_memory.add_message(SystemMessage(content="Please format your output in json format with the following fields...."))
```

I find this gets 3.5 to be more reliable, even though I'm including the same directive in my prompt.
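The trick described above can be sketched end to end. This is a toy stand-in for ConversationBufferWindowMemory (a windowed buffer keeping the last k turns), not the real LangChain class; the key idea is re-appending the system directive after each saved turn so it always sits at the end of the history:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

REMINDER = "Please format your output in json format with the following fields...."

@dataclass
class WindowedChatMemory:
    """Toy stand-in for ConversationBufferWindowMemory: keeps the last k turns."""
    k: int = 5
    messages: List[Tuple[str, str]] = field(default_factory=list)  # (role, content)

    def save_context(self, user_input: str, ai_output: str) -> None:
        self.messages.append(("human", user_input))
        self.messages.append(("ai", ai_output))
        # Keep only the last k (human, ai) turn pairs.
        self.messages = self.messages[-2 * self.k:]

def save_with_reminder(memory: WindowedChatMemory, user_input: str, ai_output: str) -> None:
    # Drop any earlier copy of the reminder, save the turn, then re-append
    # the directive so it is always the last message the model sees.
    memory.messages = [m for m in memory.messages if m[0] != "system"]
    memory.save_context(user_input, ai_output)
    memory.messages.append(("system", REMINDER))
```

Stripping the earlier copy first keeps the history from accumulating duplicate reminders as the conversation grows.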

thejollyrogers avatar Aug 14 '23 17:08 thejollyrogers