
Conversational chat and `memory.NewConversationBuffer()`

k33g opened this issue 2 years ago · 1 comment

Can a conversational chat be implemented with the memory buffer (`memory.NewConversationBuffer()`) together with `llms.GenerateFromSinglePrompt`?

I can't find where to inject the chat history.

Is it done by using `prompts.NewChatPromptTemplate`, or should I use `llmChain := chains.NewConversation(llm, memory)`?

In the latter case, I don't understand how to stream the response.

With `GenerateFromSinglePrompt` I can do something like this:

```go
llms.GenerateFromSinglePrompt(ctx, llm, promptText1,
	llms.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		fmt.Print(string(chunk))
		return nil
	}))
```

But I'm not able to do the same thing with `chains.Run`.

Any help would be appreciated.
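A minimal sketch of what this could look like, assuming `chains.Run` accepts the same kind of call options and that a `chains.WithStreamingFunc` option exists (the option name and the Ollama backend are assumptions worth verifying against the langchaingo version in use):

```go
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/tmc/langchaingo/chains"
	"github.com/tmc/langchaingo/llms/ollama"
	"github.com/tmc/langchaingo/memory"
)

func main() {
	ctx := context.Background()

	// Any llms.Model should work here; Ollama is just an example backend.
	llm, err := ollama.New(ollama.WithModel("llama3"))
	if err != nil {
		log.Fatal(err)
	}

	// The conversation chain reads from and writes to the buffer on each
	// call, so the history is injected automatically.
	buffer := memory.NewConversationBuffer()
	conversation := chains.NewConversation(llm, buffer)

	// chains.Run takes chain call options, including a streaming callback
	// (assumed here to be chains.WithStreamingFunc).
	_, err = chains.Run(ctx, conversation, "Hello, who are you?",
		chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
			fmt.Print(string(chunk))
			return nil
		}))
	if err != nil {
		log.Fatal(err)
	}
}
```

If the option exists, this would stream chunks exactly as with `GenerateFromSinglePrompt`, while the memory buffer accumulates the exchange between calls.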

k33g avatar Apr 10 '24 17:04 k33g

I think I found a way: https://github.com/genai-for-all/learning-langchain-go/blob/main/04-let-s-go-2/main.go

Now I need to adapt this sample to use prompt templates.
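A hedged sketch of that adaptation, assuming the usual langchaingo prompt and memory APIs (`prompts.MessagesPlaceholder`, `memory.WithReturnMessages`, and the exported `Memory` field on `LLMChain` are assumptions to double-check):

```go
// Build a chat prompt that leaves a slot for the conversation history.
prompt := prompts.NewChatPromptTemplate([]prompts.MessageFormatter{
	prompts.NewSystemMessagePromptTemplate(
		"You are a helpful assistant.", nil),
	// "history" matches the conversation buffer's default memory key.
	prompts.MessagesPlaceholder{VariableName: "history"},
	prompts.NewHumanMessagePromptTemplate(
		"{{.question}}", []string{"question"}),
})

chain := chains.NewLLMChain(llm, prompt)
// Return history as chat messages so the placeholder can render them.
chain.Memory = memory.NewConversationBuffer(memory.WithReturnMessages(true))

// chains.Call passes named inputs to the template and, assuming the
// streaming option exists, streams the answer chunk by chunk.
out, err := chains.Call(ctx, chain,
	map[string]any{"question": "What is Go?"},
	chains.WithStreamingFunc(func(ctx context.Context, chunk []byte) error {
		fmt.Print(string(chunk))
		return nil
	}))
```

The key design point is that the template only declares where the history goes; the memory attached to the chain is what fills that slot on each call.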

k33g avatar Apr 10 '24 20:04 k33g