langchaingo
Conversational chat and `memory.NewConversationBuffer()`
Can a conversational chat be implemented with the memory buffer (`memory.NewConversationBuffer()`) and `llms.GenerateFromSinglePrompt`?
I can't find where to inject the chat history.
Should I use `prompts.NewChatPromptTemplate`, or `llmChain := chains.NewConversation(llm, memory)`?
In the latter case, I don't understand how to stream the response.
With `GenerateFromSinglePrompt` I can do something like this:

```go
llms.GenerateFromSinglePrompt(ctx, llm, promptText1,
	llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		fmt.Print(string(chunk))
		return nil
	}))
```
But I'm not able to do the same thing with `chains.Run`. Any help would be appreciated.
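For what it's worth, here is a minimal sketch of how this might look, assuming the langchaingo API I'm familiar with: `chains.Run` accepts `ChainCallOption`s, and `chains.WithStreamingFunc` passes the same streaming callback style used with `llms.GenerateFromSinglePrompt` through to the underlying model call. The OpenAI backend and the hard-coded prompt are just placeholders; this needs `OPENAI_API_KEY` set and may need adjusting to your langchaingo version.

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/memory"
)

func main() {
	ctx := context.Background()

	llm, err := openai.New() // reads OPENAI_API_KEY from the environment
	if err != nil {
		log.Fatal(err)
	}

	// The conversation chain stores each turn in the buffer,
	// so history is injected for you on subsequent calls.
	chain := chains.NewConversation(llm, memory.NewConversationBuffer())

	// chains.Run takes ChainCallOptions; WithStreamingFunc should
	// stream chunks just like it does with GenerateFromSinglePrompt.
	_, err = chains.Run(ctx, chain, "Hi! My name is Bob.",
		chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}))
	if err != nil {
		log.Fatal(err)
	}
}
```

If this works, a second `chains.Run` on the same `chain` value should see the earlier turns from the buffer.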
I think I found a way: https://github.com/genai-for-all/learning-langchain-go/blob/main/04-let-s-go-2/main.go
Now I need to adapt this sample to use prompt templates.
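One possible direction for the prompt-template step, sketched under the assumption that `chains.NewLLMChain` exposes a settable `Memory` field and that `memory.NewConversationBuffer()` fills a `history` variable by default (both true in the langchaingo versions I've seen, but worth double-checking):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/openai"
	"github.com/tmc/langchaingo/memory"
	"github.com/tmc/langchaingo/prompts"
)

func main() {
	ctx := context.Background()

	llm, err := openai.New() // reads OPENAI_API_KEY from the environment
	if err != nil {
		log.Fatal(err)
	}

	// A template with a {{.history}} slot; the conversation buffer's
	// default memory key is "history", so the chain can fill it in.
	prompt := prompts.NewPromptTemplate(
		`You are a terse assistant.
Conversation so far:
{{.history}}
Human: {{.input}}
AI:`,
		[]string{"history", "input"},
	)

	chain := chains.NewLLMChain(llm, prompt)
	chain.Memory = memory.NewConversationBuffer()

	// Streaming works the same way as with the conversation chain.
	_, err = chains.Run(ctx, chain, "What is Go?",
		chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}))
	if err != nil {
		log.Fatal(err)
	}
}
```

Since the memory supplies `history`, `input` is the chain's only remaining input key, which is what lets the single-string form of `chains.Run` work here.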