
How to use SystemMessagePromptTemplate with ConversationSummaryBufferMemory?

Open Chauban opened this issue 2 years ago • 2 comments

I am trying to set a "system" role message when using ConversationChain with ConversationSummaryBufferMemory (CSBM), but it fails. When I swap ConversationSummaryBufferMemory for ConversationBufferMemory, it works. However, I'd like to keep CSBM's automatic summarization when the history exceeds max_token_limit.

Below is my code and the error message:

```python
from langchain.chains import ConversationChain

chat_prompt = ChatPromptTemplate.from_messages([
    system_message_prompt,
    MessagesPlaceholder(variable_name="history"),
    human_message_prompt,
])

conversation_with_summary = ConversationChain(
    llm=chat,
    memory=ConversationSummaryBufferMemory(llm=chat, max_token_limit=10),
    prompt=chat_prompt,
    verbose=True,
)

conversation_with_summary.predict(input="hello")
```


```
> Entering new ConversationChain chain...

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[125], line 34
     24 conversation_with_summary = ConversationChain(
     25     #llm=llm,
     26     llm=chat,
    (...)
     31     verbose=True
     32 )
     33 #conversation_with_summary.predict(identity="佛祖", text="你好")
---> 34 conversation_with_summary.predict(input="你好")

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py:151, in LLMChain.predict(self, **kwargs)
    137 def predict(self, **kwargs: Any) -> str:
    138     """Format prompt with kwargs and pass to LLM.
    139
    140     Args:
    (...)
    149         completion = llm.predict(adjective="funny")
    150     """
--> 151     return self(kwargs)[self.output_key]

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py:116, in Chain.__call__(self, inputs, return_only_outputs)
    114 except (KeyboardInterrupt, Exception) as e:
    115     self.callback_manager.on_chain_error(e, verbose=self.verbose)
--> 116     raise e
    117 self.callback_manager.on_chain_end(outputs, verbose=self.verbose)
    118 return self.prep_outputs(inputs, outputs, return_only_outputs)

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\base.py:113, in Chain.__call__(self, inputs, return_only_outputs)
    107 self.callback_manager.on_chain_start(
    108     {"name": self.__class__.__name__},
    109     inputs,
    110     verbose=self.verbose,
    111 )
    112 try:
--> 113     outputs = self._call(inputs)
    114 except (KeyboardInterrupt, Exception) as e:
    115     self.callback_manager.on_chain_error(e, verbose=self.verbose)

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py:57, in LLMChain._call(self, inputs)
     56 def _call(self, inputs: Dict[str, Any]) -> Dict[str, str]:
---> 57     return self.apply([inputs])[0]

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py:118, in LLMChain.apply(self, input_list)
    116 def apply(self, input_list: List[Dict[str, Any]]) -> List[Dict[str, str]]:
    117     """Utilize the LLM generate method for speed gains."""
--> 118     response = self.generate(input_list)
    119     return self.create_outputs(response)

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py:61, in LLMChain.generate(self, input_list)
     59 def generate(self, input_list: List[Dict[str, Any]]) -> LLMResult:
     60     """Generate LLM result from inputs."""
---> 61     prompts, stop = self.prep_prompts(input_list)
     62     return self.llm.generate_prompt(prompts, stop)

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\chains\llm.py:79, in LLMChain.prep_prompts(self, input_list)
     77 for inputs in input_list:
     78     selected_inputs = {k: inputs[k] for k in self.prompt.input_variables}
---> 79     prompt = self.prompt.format_prompt(**selected_inputs)
     80 _colored_text = get_colored_text(prompt.to_string(), "green")
     81 _text = "Prompt after formatting:\n" + _colored_text

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\prompts\chat.py:173, in ChatPromptTemplate.format_prompt(self, **kwargs)
    167 elif isinstance(message_template, BaseMessagePromptTemplate):
    168     rel_params = {
    169         k: v
    170         for k, v in kwargs.items()
    171         if k in message_template.input_variables
    172     }
--> 173     message = message_template.format_messages(**rel_params)
    174     result.extend(message)
    175 else:

File ~\AppData\Local\Programs\Python\Python310\lib\site-packages\langchain\prompts\chat.py:43, in MessagesPlaceholder.format_messages(self, **kwargs)
     41 value = kwargs[self.variable_name]
     42 if not isinstance(value, list):
---> 43     raise ValueError(
     44         f"variable {self.variable_name} should be a list of base messages, "
     45         f"got {value}"
     46     )
     47 for v in value:
     48     if not isinstance(v, BaseMessage):

ValueError: variable history should be a list of base messages, got
```
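For context, the error comes from the validation in MessagesPlaceholder.format_messages: the "history" variable must be a list of message objects, but a memory configured without return_messages=True hands back one flat string. The following is a plain-Python sketch of that check; BaseMessage and format_messages here are simplified stand-ins, not the real langchain classes.

```python
class BaseMessage:
    """Stand-in for langchain's message type."""
    def __init__(self, content):
        self.content = content

def format_messages(history):
    # Mirrors the MessagesPlaceholder validation: reject anything
    # that is not a list of messages.
    if not isinstance(history, list):
        raise ValueError(
            f"variable history should be a list of base messages, got {history}"
        )
    return history

# A string buffer (what the memory returns by default) is rejected:
try:
    format_messages("Human: hello\nAI: hi")
except ValueError as e:
    print("rejected:", e)

# A list of message objects is accepted:
msgs = format_messages([BaseMessage("hello"), BaseMessage("hi")])
print("accepted:", [m.content for m in msgs])
```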

Chauban avatar Mar 24 '23 16:03 Chauban

It seems that calling ConversationSummaryBufferMemory with return_messages set to True works well. Please try the following:

```python
memory = ConversationSummaryBufferMemory(llm=chat, max_token_limit=10, return_messages=True)
```
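To illustrate what return_messages toggles, here is a hypothetical mock (FakeSummaryBufferMemory is invented for this sketch, not a langchain class): the same stored turns come back either as one formatted string or as a list of message-like objects, and only the latter satisfies MessagesPlaceholder.

```python
class FakeSummaryBufferMemory:
    """Invented mock of the return_messages behavior."""
    def __init__(self, return_messages=False):
        self.return_messages = return_messages
        self.turns = [("Human", "hello"), ("AI", "hi")]

    def load_memory_variables(self):
        if self.return_messages:
            # List of message-like objects: works with MessagesPlaceholder.
            return {"history": [{"role": r, "content": c} for r, c in self.turns]}
        # Flat string buffer: triggers the ValueError seen above.
        return {"history": "\n".join(f"{r}: {c}" for r, c in self.turns)}

print(FakeSummaryBufferMemory().load_memory_variables()["history"])
print(FakeSummaryBufferMemory(return_messages=True).load_memory_variables()["history"])
```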

Aratako avatar Mar 25 '23 00:03 Aratako

It works well, thanks!

Chauban avatar Mar 25 '23 08:03 Chauban

> It seems that calling ConversationSummaryBufferMemory with return_messages set to True works well. Please try the following:
>
> `memory = ConversationSummaryBufferMemory(llm=chat, max_token_limit=10, return_messages=True)`

I am also getting that error even though I am using return_messages=True.

https://gist.github.com/RageshAntony/793b76c68a0fea036d337305927643af

RageshAntony avatar May 26 '23 07:05 RageshAntony

Hi, @Chauban! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you reported an issue regarding the ConversationSummaryBufferMemory and ConversationChain. Aratako suggested a workaround by setting return_messages to True in the ConversationSummaryBufferMemory constructor, which seemed to work for you. However, RageshAntony mentioned that they are still experiencing the error even with return_messages=True.

Before we close this issue, we wanted to check with you if this issue is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on this issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project!

dosubot[bot] avatar Sep 21 '23 16:09 dosubot[bot]