
Issues with leading line breaks in conversational agent - possible solution?

Open ColinTitahi opened this issue 2 years ago • 2 comments

When the LLM returns a leading line break, a "Could not parse LLM output" error is raised from /agents/conversational/base.py. I am using the conversational-react-description agent.

This can be reliably replicated by asking "Write three lines with line breaks". Note that the return does not have a space after the initial "AI:", which is what causes the issue:

```
Thought: Do I need to use a tool? No
AI:
Line one
Line two
Line three
```
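For reference, a reproduction sketch along these lines should trigger the error. The import paths, the OpenAI LLM, and the llm-math tool are illustrative assumptions and may vary by langchain version; the agent string is the one mentioned above.

```python
from langchain.agents import initialize_agent, load_tools
from langchain.chains.conversation.memory import ConversationBufferMemory
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)
tools = load_tools(["llm-math"], llm=llm)  # any tool list works; llm-math is arbitrary

# The conversational agent expects a "chat_history" memory key.
memory = ConversationBufferMemory(memory_key="chat_history")

agent = initialize_agent(
    tools,
    llm,
    agent="conversational-react-description",
    memory=memory,
    verbose=True,
)

# A multi-line answer makes the model emit "AI:" followed directly by a
# newline, which the output parsing in base.py cannot handle.
agent.run("Write three lines with line breaks")
```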

The problem appears to be line 78:

```python
if f"{self.ai_prefix}: " in llm_output:
```

A space after the ai_prefix is expected there. With the example above there is no space, so the check fails. I have tried the solution below, which simply adds the space when the prefix is immediately followed by a newline:

```python
def _extract_tool_and_input(self, llm_output: str) -> Optional[Tuple[str, str]]:
    # New line to add a space after the prefix
    llm_output = llm_output.replace(f"{self.ai_prefix}:\n", f"{self.ai_prefix}: \n")
    # ... existing parsing logic continues unchanged
```
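As a standalone sanity check of the idea (plain Python, not langchain code; "AI" is the default ai_prefix, and the split below only approximates what the parser does with the output):

```python
ai_prefix = "AI"

# Output reproduced above: the prefix is followed directly by a newline.
llm_output = "Thought: Do I need to use a tool? No\nAI:\nLine one\nLine two\nLine three"

print(f"{ai_prefix}: " in llm_output)  # False -> "Could not parse LLM output"

# Proposed normalization: insert the missing space after the prefix.
llm_output = llm_output.replace(f"{ai_prefix}:\n", f"{ai_prefix}: \n")

print(f"{ai_prefix}: " in llm_output)              # True -> parsing can proceed
print(llm_output.split(f"{ai_prefix}: ")[-1])      # "\nLine one\nLine two\nLine three"
```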

ColinTitahi commented on Jan 30 '23

good catch - will fix

hwchase17 commented on Jan 31 '23

Awesome work BTW @hwchase17

ColinTitahi commented on Jan 31 '23

should be fixed now! thanks for flagging

hwchase17 commented on Feb 03 '23