Any ideas on making input and output languages consistent?

Open realgump opened this issue 2 years ago • 3 comments

Issue you'd like to raise.

When I use an agent with a GPT-3.5 LLM and a Google search tool, the AI's response is always in English, regardless of my input being in Chinese. Are there any ideas on how to ensure that the input and output languages are consistent?
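
A minimal setup that reproduces this looks roughly like the following (illustrative sketch; I'm assuming the SerpAPI tool here for the Google search):

from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI

llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)
tools = load_tools(["serpapi"])  # Google search via SerpAPI

agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)

# The question is in Chinese ("What are the most popular programming languages?"),
# but the final answer comes back in English.
agent.run("最流行的编程语言有哪些?")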

Suggestion:

No response

realgump · May 12 '23 04:05

I guess you could explicitly ask the LLM (in the prompt) to answer in the same language as the input?
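
Something along these lines, for example (a minimal sketch without an agent, assuming a plain OpenAI LLM; the exact wording is up to you):

from langchain.chains import LLMChain
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

# Bake the language instruction directly into the prompt.
prompt = PromptTemplate(
    input_variables=["question"],
    template=(
        "Answer the question below. "
        "Always reply in the same language the question is written in.\n\n"
        "Question: {question}"
    ),
)
chain = LLMChain(llm=OpenAI(temperature=0), prompt=prompt)

# "What are the most popular programming languages?"
print(chain.run(question="最流行的编程语言有哪些?"))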

andreakiro · May 12 '23 07:05

@andreakiro It doesn't work with agents, because they translate the question to English in action_input, so the response is always in English.

KeshavSingh29 · May 12 '23 08:05

I have the same problem using agent=ConversationalAgent, llm=OpenAI, and a set of tools. I confirm the agent always translates the question to English and puts it in action_input, so the output answer is in English. Is this the expected behavior of agents, or is it just a bug? Is there any workaround? Thanks in advance for your suggestions.

oskrocha · May 26 '23 15:05

@oskrocha This issue only arises when I use GPT-3.5, not GPT-3. I debugged the code step by step and identified a possible reason. When using GPT-3.5, some tool descriptions are sent after the user's input and are not shown in the final response. However, when using GPT-3, all tool descriptions are sent together with the user's input. Therefore, I had to add my language instruction to the prompt, and it worked. I hope this information is helpful to you.

realgump · Jun 05 '23 06:06

Here is a solution that works pretty well (tested with multiple input languages). I have only tested it with the conversational-react-description agent type, and it requires you to tweak the tool as well as the agent prompt.

from langchain.agents import initialize_agent
from langchain.llms import OpenAI
from langchain.tools import BaseTool

# For the tool, have a pre-defined input format
class KnowledgeTool(BaseTool):
    request_format = '{{"USER": "<input_question>"}}'
    name = 'Knowledge Tool'
    description = f"""
    Tool to answer something knowledgeable. Input should be JSON in the following format: {request_format}
        """
    return_direct = False

    # define your tool methods (_run, _arun) etc. ...

# Initialize the agent (llm can be any model you already use)
llm = OpenAI(temperature=0)
conversational_agent = initialize_agent(
        tools=[KnowledgeTool()],
        llm=llm,
        agent='conversational-react-description')

# You can identify the language with any modules like lingua or fasttext
language = "Chinese"  # for example
prompt_prefix = f"""<Your description similar to system message> Use only {language} language to reply"""

# Now the magic part
conversational_agent.agent.llm_chain.prompt = prompt_prefix

Let me know if it doesn't work, I can help.

KeshavSingh29 · Jun 06 '23 03:06

Hi @KeshavSingh29, thank you for sharing your code. I gave it a try, but it showed me this error:

File /Applications/anaconda3/envs/testing_chat/lib/python3.9/site-packages/langchain/chains/base.py:239, in Chain.run(self, callbacks, *args, **kwargs)
    236     return self(args[0], callbacks=callbacks)[self.output_keys[0]]
    238 if kwargs and not args:
--> 239     return self(kwargs, callbacks=callbacks)[self.output_keys[0]]
    241 if not kwargs and not args:
    242     raise ValueError(
    243         "`run` supported with either positional arguments or keyword arguments,"
    244         " but none were provided."
    245     )

File /Applications/anaconda3/envs/testing_chat/lib/python3.9/site-packages/langchain/chains/base.py:123, in Chain.__call__(self, inputs, return_only_outputs, callbacks)
    106 def __call__(
    107     self,
    108     inputs: Union[Dict[str, Any], Any],
    109     return_only_outputs: bool = False,
    110     callbacks: Callbacks = None,
    111 ) -> Dict[str, Any]:
    112     """Run the logic of this chain and add to output if desired.
    113 
    114     Args:
...
     52     :meta private:
     53     """
---> 54     return self.prompt.input_variables

AttributeError: 'str' object has no attribute 'input_variables'

My code was:

agent_chain = initialize_agent(tools, fast_chat, agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION, verbose=True, memory=memory)
agent_chain.agent.llm_chain.prompt = prompt_prefix
agent_chain.run(input=some_input)

Could you please take a look?

PeterTF656 · Jun 06 '23 10:06

@PeterTF656 Like the error says: "run supported with either positional arguments or keyword arguments, but none were provided." Can you show how you set up the memory? It seems like the keys are not set up.

KeshavSingh29 · Jun 07 '23 03:06

@KeshavSingh29 Thank you for your swift response! I set up memory the normal way: memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
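
For comparison, a minimal self-contained version of my setup looks like this (ChatOpenAI here is only standing in for fast_chat, and the tool list is a placeholder):

from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

fast_chat = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)  # stand-in for my chat model
tools = load_tools(["llm-math"], llm=fast_chat)                    # placeholder tool list
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)

prompt_prefix = "You are a helpful assistant. Use only Chinese language to reply."

agent_chain = initialize_agent(tools, fast_chat, agent=AgentType.CHAT_CONVERSATIONAL_REACT_DESCRIPTION,
                               verbose=True, memory=memory)
agent_chain.agent.llm_chain.prompt = prompt_prefix   # the assignment from the snippet above
agent_chain.run(input="最流行的编程语言有哪些?")        # question in Chinese; raises the AttributeError shown earlier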

PeterTF656 · Jun 08 '23 15:06

@PeterTF656 Thanks for the reply. Unfortunately, I can't replicate it. Can you share the working code to replicate the issue? And what exactly is fast_chat?

KeshavSingh29 · Jun 12 '23 01:06

@realgump Could you show the code snippet you used to solve the problem?

MrRace · Jul 06 '23 08:07

Hi, @realgump! I'm Dosu, and I'm here to help the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, the issue you raised is about ensuring consistent input and output languages when using OpenAI's GPT-3.5 model with a search tool. There have been some discussions in the comments about possible solutions, such as adding language commands in the prompt or tweaking the tool and agent prompt. One user even shared a code snippet that worked for them. However, it seems that the issue remains unresolved, as there was a user who encountered an error and requested further assistance.

Before we close this issue, we wanted to check with you if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself, or it will be automatically closed in 7 days.

Thank you for your understanding and contribution to the LangChain project!

dosubot[bot] · Oct 12 '23 16:10

Thanks! It works for me like this:

from langchain.agents import AgentType, initialize_agent  # tools and llm are defined as usual

language = "Chinese"

# Finally, let's initialize an agent with the tools, the language model, and the type of agent we want to use.
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION, verbose=True)

# You can identify the language with any modules like lingua or fasttext
prompt_prefix = f"""<Your description similar to system message> Use only {language} language to reply:"""

# Now the magic part: prepend the language instruction to the existing prompt template.
# (Modifying prompt.template keeps the PromptTemplate object intact, unlike replacing
# agent.agent.llm_chain.prompt with a plain string, which loses input_variables.)
agent.agent.llm_chain.prompt.template = prompt_prefix + agent.agent.llm_chain.prompt.template

# Now let's test it out!
# The input means: "Answer in Chinese: What are the most popular programming languages? List the top 5."
agent.invoke(input="答案用中文回答:最流行的编程语言有哪些,列出最流行的5个?")

wnose · Jan 19 '24 08:01