
Issue: All types of langchain memory fail to work properly.

Open Eng-ZeyadTarek opened this issue 1 year ago • 1 comments

I've built many langchain agents with different types of memory so that they remember the conversation context correctly.

The problem is that if I send the same message again, the agent generates the same thought and answer as before, even though that is not what I am referring to.

Example of the incorrect conversation that happens between me and the agent when using GPTIndexChatMemory:

User:- Hi

AI:- Hi, how can I help you today?

User: What's machine learning?

AI:- Machine learning is a field of artificial intelligence (AI) that focuses on developing algorithms and models that enable computers to learn from data and make predictions or decisions without being explicitly programmed. The primary goal of machine learning is to enable computers to automatically learn and improve from experience or examples. (good answer referring to my previous question)

User:- I didn't get it.

AI:- I apologize if my previous response was not clear. Let me provide a simpler explanation: Machine learning is a branch of artificial intelligence (AI) that focuses on teaching computers to learn and make decisions or predictions without being explicitly programmed for each specific task. (good answer referring to my previous message; he knew that I didn't understand the definition of ML)

User:- what's the physics?

AI:- Physics is a natural science that studies the fundamental principles and laws governing the physical world. It seeks to understand the behavior of matter, energy, and the interactions between them. (good answer referring to my previous message)

User:- I didn't get it. (same message again, which was already saved in the previous chat history)

AI:- I apologize if my previous response was not clear. Let me provide a simpler explanation: Machine learning is a branch of artificial intelligence (AI) that focuses on teaching computers to learn and make decisions or predictions without being explicitly programmed for each specific task. (not the correct answer; I'm referring to the definition of physics, not the definition of ML)

User: What's machine learning?

AI:- Machine learning is a field of artificial intelligence (AI) that focuses on developing algorithms and models that enable computers to learn from data and make predictions or decisions without being explicitly programmed. The primary goal of machine learning is to enable computers to automatically learn and improve from experience or examples. (he generated the same answer from the previous chat history, while I need him to give the user more information, not just repeat the previous answer)

Example of the correct conversation I want to achieve when using GPTIndexChatMemory:

User:- Hi

AI:- Hi, how can I help you today?

User: What's machine learning?

AI:- Machine learning is a field of artificial intelligence (AI) that focuses on developing algorithms and models that enable computers to learn from data and make predictions or decisions without being explicitly programmed. The primary goal of machine learning is to enable computers to automatically learn and improve from experience or examples. (good answer referring to my previous question)

User:- I didn't get it.

AI:- I apologize if my previous response was not clear. Let me provide a simpler explanation: Machine learning is a branch of artificial intelligence (AI) that focuses on teaching computers to learn and make decisions or predictions without being explicitly programmed for each specific task. (good answer referring to my previous message; he knew that I didn't understand the definition of ML)

User:- what's the physics?

AI:- Physics is a natural science that studies the fundamental principles and laws governing the physical world. It seeks to understand the behavior of matter, energy, and the interactions between them. (good answer referring to my previous message)

User:- I didn't get it.

AI:- I apologize for any confusion caused. Let me provide a more simplified explanation of physics: Physics is the branch of science that deals with understanding and explaining how the world around us works. It is concerned with studying matter (everything that makes up the physical world) and energy (the ability to do work). (now it's good because he knows that I'm referring to the definition of physics, not the definition of ML, even though the "I didn't get it." message was already saved in the previous chat history)

User: What's machine learning?

AI:- Machine learning is a field of artificial intelligence that focuses on developing algorithms and models capable of learning from data and making predictions or decisions. The primary idea behind machine learning is to enable computers to learn and improve automatically without explicit programming. (better answer; although I repeated the same question, he didn't just pull the same answer from the previous chat history)

I know that the problem is with the memory, because if I build my agent with ConversationBufferWindowMemory with k = 1, it produces this kind of conversation. But since I'm using GPTIndexChatMemory, it saves all the messages, questions, and answers of the full conversation in memory and brings back the same answer from the previous chat history whenever a message/question is repeated, which is totally wrong.
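For comparison, this is roughly how the window-memory setup that behaves as intended can be constructed (a minimal sketch; only the memory object is shown, and the rest of the agent setup below stays the same):

```python
from langchain.memory import ConversationBufferWindowMemory

# Keep only the most recent exchange, so "I didn't get it." is interpreted
# against the latest topic instead of being matched to an older turn.
window_memory = ConversationBufferWindowMemory(
    k=1,                       # window size: only the last interaction is kept
    memory_key="chat_history",
    input_key="input",
)
```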

This is the prompt I use to instruct my agent, which uses the CONVERSATIONAL_REACT_DESCRIPTION agent type:

"""

SMSM bot, your main objective is to provide the most helpful and accurate responses to the user Zeyad. To do this, you have a powerful toolset and the ability to learn and adapt to the conversation's context.

GOAL: The priority is to keep the conversation flowing smoothly. Offer new insights, avoid repetitive responses, and refrain from relying on chat history without considering the most recent context. Always place emphasis on the most recent question or topic raised by the user, and tailor your responses to match his inquiries.

Consider the following scenarios:

Scenario 1: Whenever the user introduces a new topic, all his subsequent messages are assumed to refer to this latest topic, even if the same message/question already exists in the previous chat history under a different topic. This context remains until the user explicitly changes the topic. Do not seek clarification on the topic unless the user's message is ambiguous within the context of the latest topic. For example, if the user asked about Machine Learning and then about Physics, and subsequently said, "I didn't get it," your responsibility is to provide further explanation about Physics (the latest topic), not Machine Learning (the previous topic), and not to ask which topic he's referring to. The phrase "I didn't get it" must be associated with the most recent topic discussed.

Scenario 2: If the user asks the same question, or a general knowledge question that has been asked and answered before, don't just repeat the previous answer verbatim from the chat history. Instead, add more value, provide a different perspective, or delve deeper into the topic, aiming to generate a better and different answer that provides additional value.

You MUST use the following format to provide the answer to the user:

Thought: I have to identify the topic we are currently discussing with the user, deeply analyze the user's message, find out his intention, and determine whether he is referring to the current topic or not, regardless of the previous chat history and with regard to (Scenario 1, GOAL). AI: [your response here]

Begin!

Previous chat history: {chat_history}

New input: {input}

"""

This is how I define the agent and my memory:

```python
# Import paths assume the langchain / llama_index versions that were current
# when this issue was filed.
from langchain.agents import AgentType, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import HuggingFaceEmbeddings
from llama_index import GPTListIndex, LangchainEmbedding, ServiceContext
from llama_index.langchain_helpers.memory_wrapper import GPTIndexChatMemory
from llama_index.query_engine import RetrieverQueryEngine

# Use a local HuggingFace embedding model for the index.
embed_model = LangchainEmbedding(HuggingFaceEmbeddings())
service_context = ServiceContext.from_defaults(embed_model=embed_model)

# Empty list index that GPTIndexChatMemory populates with the conversation.
index = GPTListIndex([], service_context=service_context)
# retriever = index.as_retriever(retriever_mode='embedding')
# query_engine = RetrieverQueryEngine(retriever)

memory = GPTIndexChatMemory(
    index=index,
    memory_key="chat_history",
    query_kwargs={"response_mode": "compact"},
    input_key="input",
)

# `tools` is defined elsewhere in the script.
agent_chain = initialize_agent(
    tools,
    ChatOpenAI(temperature=0),
    agent=AgentType.CONVERSATIONAL_REACT_DESCRIPTION,
    verbose=True,
    handle_parsing_errors=True,
    memory=memory,
)
```
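For illustration, the agent defined above would then be driven turn by turn roughly like this (a minimal sketch; each call writes the exchange into the `GPTIndexChatMemory` index, which is what later brings back the stale answer):

```python
# Each run() call stores the (input, response) pair in the index-backed memory,
# so a repeated message later retrieves the earlier answer almost verbatim.
print(agent_chain.run(input="Hi"))
print(agent_chain.run(input="What's machine learning?"))
print(agent_chain.run(input="I didn't get it."))
print(agent_chain.run(input="what's the physics?"))
print(agent_chain.run(input="I didn't get it."))  # returns the ML clarification again
```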

Eng-ZeyadTarek avatar May 26 '23 16:05 Eng-ZeyadTarek

This is actually an issue with AI memory in general, not langchain specifically. For the AI to differentiate between the two conversations, you need something that also supplies context via metadata or some other mechanism; temporal or other time-related info would help mitigate this issue. As far as the AI knows, 'I didn't get it' was only associated with machine learning, so something that gives weight to the most recent response could make it prefer that set of tokens, but even that may not be enough. When you say 'I didn't get it', your brain assigns a value to the pronouns 'I' and 'it' in this sentence because they are, in essence, dynamic variables; the LLM doesn't know this. 'It' was already filled by the response about machine learning, and the AI doesn't know how to reassign that variable without instruction or some mechanism to do so. If you changed your prompt to say 'I didn't get the answer to the question about physics', it would likely have responded the way you want.
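One rough way to approximate the temporal/metadata idea without changing the memory class is to stamp each user turn with the time before it reaches the agent, so the text stored in memory carries an ordering signal. This is an illustrative sketch, not a LangChain feature; `run_with_timestamp` is a hypothetical helper:

```python
from datetime import datetime, timezone

def run_with_timestamp(agent_chain, user_message: str) -> str:
    """Prefix the user turn with a UTC timestamp so the text that ends up
    in the index-backed memory carries temporal context."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    return agent_chain.run(input=f"[{stamp}] {user_message}")

# Usage: run_with_timestamp(agent_chain, "I didn't get it.")
```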

ether8unny avatar May 27 '23 04:05 ether8unny

Hi, @Eng-ZeyadTarek! I'm Dosu, and I'm helping the LangChain team manage their backlog. I wanted to let you know that we are marking this issue as stale.

From what I understand, you reported an issue with the GPTIndexChatMemory where the agent generates the same answer from the previous chat history if the message is repeated. Ether8unny commented that this is a general issue with AI memory and suggested that supplying context via metadata or time-related information could help mitigate the problem. Pedram-tehranchi reacted with a thumbs up to this comment.

Before we close this issue, we wanted to check if it is still relevant to the latest version of the LangChain repository. If it is, please let us know by commenting on the issue. Otherwise, feel free to close the issue yourself or it will be automatically closed in 7 days.

Thank you for your contribution to the LangChain repository!

dosubot[bot] avatar Sep 11 '23 16:09 dosubot[bot]