Conversation With Memory
My goal is to create a conversation with memory.

My Input: my name is rotan
AI Answer: hello rotan, how are you today? (the answer is something like this)
My Input: what is my name?
AI Answer: your name is rotan
Expected Result: the AI can take an answer from my previous input.
llm, err := openai.New()
if err != nil {
	fmt.Println("err create llm ", err)
}
mem := memory.NewConversationBuffer()
llm2 := chains.NewConversation(llm, mem)
reader := bufio.NewReader(os.Stdin)
for {
	// ReadString will block until the delimiter is entered
	input, err := reader.ReadString('\n')
	if err != nil {
		fmt.Println("An error occurred while reading input. Please try again", err)
		return
	}
	ctx := context.Background()
	// remove the delimiter from the string
	input = strings.TrimSuffix(input, "\n")
	completion, _ := llm2.Call(ctx, map[string]any{
		"history": []string{},
		"input":   input,
	})
	chatHistory := mem.ChatHistory
	mem = memory.NewConversationBuffer(memory.WithChatHistory(chatHistory))
	llm2 = chains.NewConversation(llm, mem)
	fmt.Println(completion)
}
Expected Result: the AI can take an answer from my previous input.
Actual Result: the AI doesn't know my name. Is it because I replace the llm2 object with NewConversation, so the prompt creates a new prompt template? And what should the value of "history" be?
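For reference, the usual fix is to create the chain once and let its memory accumulate turns, instead of rebuilding the chain and passing an empty "history" value on every iteration. A minimal sketch of that shape, assuming the same langchaingo API used above (chains.Run supplies the "history" value from the chain's memory, so it is not passed manually):

```go
// Create the LLM, memory, and conversation chain ONCE, outside the loop.
llm, err := openai.New()
if err != nil {
	log.Fatal(err)
}
mem := memory.NewConversationBuffer()
conversation := chains.NewConversation(llm, mem)

reader := bufio.NewReader(os.Stdin)
ctx := context.Background()
for {
	input, err := reader.ReadString('\n')
	if err != nil {
		return
	}
	input = strings.TrimSuffix(input, "\n")

	// chains.Run fills in "history" from the chain's memory, so there is
	// no need to pass it manually or to recreate mem / conversation here.
	out, err := chains.Run(ctx, conversation, input)
	if err != nil {
		fmt.Println("run error:", err)
		continue
	}
	fmt.Println(out)
}
```

The key design point is that the buffer only remembers earlier turns if the same conversation chain (and the same memory instance) is reused across iterations.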
I followed this example to use memory, and it works for me: https://github.com/tmc/langchaingo/blob/main/examples/llm-chain-example/llm_chain.go
llm, err := openai.New()
memory := memory.NewConversationBuffer()
llmChain := chains.NewConversation(llm, memory)
out, err := chains.Run(ctx, llmChain, "my name is xxx",
	chains.WithCallback(callbacks.StreamLogHandler{}),
)
out2, err := chains.Run(ctx, llmChain, "what is my name ?",
	chains.WithCallback(callbacks.StreamLogHandler{}),
)
What if I want to save the memory's chat history messages to a DB and later load the LLM with the memory I saved? Does it have to be defined manually like this?
mem := memory.NewConversationBuffer(memory.WithChatHistory(memory.NewChatMessageHistory(
	memory.WithPreviousMessages([]schema.ChatMessage{
		schema.HumanChatMessage{Content: ""},
		schema.AIChatMessage{Content: ""},
	}),
)))
I followed this example to use memory, and it works for me: https://github.com/tmc/langchaingo/blob/main/examples/llm-chain-example/llm_chain.go

llm, err := openai.New()
memory := memory.NewConversationBuffer()
llmChain := chains.NewConversation(llm, memory)
out, err := chains.Run(ctx, llmChain, "my name is xxx",
	chains.WithCallback(callbacks.StreamLogHandler{}),
)
out2, err := chains.Run(ctx, llmChain, "what is my name ?",
	chains.WithCallback(callbacks.StreamLogHandler{}),
)
This one doesn't work for me; it says "cannot use llmChain (variable of type chains.LLMChain) as chains.Chain value in argument to chains.Run: missing method Call".
It looks like a bug.
Does it work for you with the latest version?
I didn't encounter any problems using memory in a conversation chain. This is my test code:
llm, _ := ollama.New(ollama.WithModel("phi"))
memory := memory.NewConversationBuffer()
ctx := context.TODO()
llmChain := chains.NewConversation(llm, memory)
out, _ := chains.Run(ctx, llmChain, "my name is aciegn")
fmt.Println(out)
out2, _ := chains.Run(ctx, llmChain, "what is my name ?")
fmt.Println(out2)
langchaingo version:
require github.com/tmc/langchaingo v0.1.4
What if I want to save the memory's chat history messages to a DB and later load the LLM with the memory I saved? Does it have to be defined manually like this?

mem := memory.NewConversationBuffer(memory.WithChatHistory(memory.NewChatMessageHistory(
	memory.WithPreviousMessages([]schema.ChatMessage{
		schema.HumanChatMessage{Content: ""},
		schema.AIChatMessage{Content: ""},
	}),
)))
Hey, I found an easier way to do it:
func CreateChatWithContextNoLimit(api_token string, model_name string) (string, error) {
	ctx := context.Background()
	token := api_token
	llm, err := openai.New(
		openai.WithToken(token),
		openai.WithModel(model_name),
		//llms.WithOptions()
		//openai.WithBaseURL("http://localhost:8080/v1/"),
		//openai.WithAPIVersion("v1"),
	)
	if err != nil {
		log.Fatal(err)
	}
	memory_buffer := memory.NewConversationBuffer()

	// test data
	// First dialogue pair
	inputValues1 := map[string]any{"input": "Hi"}
	outputValues1 := map[string]any{"output": "What's up"}
	// Second dialogue pair
	inputValues2 := map[string]any{"input": "Not much, just hanging"}
	outputValues2 := map[string]any{"output": "Cool"}
	memory_buffer.SaveContext(ctx, inputValues1, outputValues1)
	memory_buffer.SaveContext(ctx, inputValues2, outputValues2)

	//memory_buffer.ChatHistory.AddUserMessage(ctx, "Hi!")
	//memory_buffer.ChatHistory.AddAIMessage(ctx, "What's up")
	memory_buffer.ChatHistory.AddUserMessage(ctx, "I am working at my new exciting golang AI project called 'Andromeda'")
	memory_buffer.ChatHistory.AddUserMessage(ctx, "My name is Bekket btw")

	conversation := chains.NewConversation(llm, memory_buffer)
	result, err := chains.Run(ctx, conversation, "what is my name and what project am I currently working on?")
	if err != nil {
		return "", err
	}
	log.Println(result)
	return result, err
}
So you can create an empty memory buffer and then populate it with initial data as maps (it's important to pass the first messages like this).
After that you can just add new messages to the memory buffer (or store the buffer in some DB and load it) with memory_buffer.ChatHistory.AddMessage().
Then you create the dialogue thread, passing the buffer with the previous messages, and call chains.Run.
Yep, it was just a bug in my IDE's linter; it compiles fine from the terminal, and everything is OK.
In fact, it would be nice to see some examples of how memory works in this repo.
Yes, it doesn't work with conversation history on version 0.1.9! For example: conversations = []string{"what is the phone number", "this phone number is 123456"}, but the result is "I don't know" (no error). It discards the chat history.
history := memory.NewChatMessageHistory()
for index, conversation := range conversations {
	if index&0x1 == 1 {
		history.AddAIMessage(ctx, conversation)
	} else {
		history.AddUserMessage(ctx, conversation)
	}
}
chain := chains.NewConversationalRetrievalQA(
	chains.LoadStuffQA(qar.data.llm[index]),
	chains.LoadCondenseQuestionGenerator(qar.data.llm[index]),
	&Retrieve{},
	memory.NewConversationBuffer(memory.WithChatHistory(history)),
)
result, err := chains.Run(ctx, chain, "what is the phone number", options...)