alpaca.cpp
Is there a way to make it remember the conversation like ChatGPT does?
Right now, every prompt seems to generate a brand new response with no memory of the previous conversation.
you need to feed the history in every prompt.
I have the same question. How would you implement this technically? Through a separate uploaded file?
Alpaca will never be like ChatGPT.
@gsgoldma unfortunately, I don't think there is a way. But the upstream repo's interactive chat option does remember chat history. Since they added Alpaca support, you can try using that.
> you need to feed the history in every prompt.
I ask "write a story about xxx" and get the response "bla-bla about xxx". Then I ask "continue story: bla-bla about xxx" and get the same result again: "bla-bla about xxx".
@RichardTLabble
You have to implement it yourself by defining and storing the conversation context in the memory of your Alpaca "wrapper". For example, using this technique:
You start with this prompt, which includes a "context prefix" and the user's first request:
Write a response that answers the request according to the conversation history below.
Request: Hello, my name is Samuel.
Response:
Alpaca returns:
Response: Hello, how can I assist you?
You continue with this prompt, which concatenates Alpaca's previous response with a new user request:
Write a response that answers the request according to the conversation history below.
Request: Hello, my name is Samuel.
Response: Hello, how can I assist you?
Request: What is my name?
Response:
Alpaca then answers with the context in mind:
Response: Samuel
And so on...
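The loop above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual API: `run_alpaca` is a hypothetical stand-in for whatever call invokes the model (e.g. a subprocess wrapper around the alpaca.cpp binary), and here it is stubbed out so the history-building logic can be seen on its own.

```python
PREFIX = ("Write a response that answers the request "
          "according to the conversation history below.\n")

def run_alpaca(prompt):
    # Hypothetical stub: replace with a real call into alpaca.cpp
    # (e.g. via subprocess). Always answers the same thing here.
    return "Hello, how can I assist you?"

history = []  # list of (request, response) turns kept in the wrapper

def chat(request):
    # Rebuild the full prompt from the stored history on every turn.
    prompt = PREFIX
    for req, resp in history:
        prompt += f"Request: {req}\nResponse: {resp}\n"
    prompt += f"Request: {request}\nResponse:"
    response = run_alpaca(prompt).strip()
    history.append((request, response))
    return response
```

Each call to `chat` replays every earlier turn into the prompt, which is exactly why the model appears to "remember" the conversation.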
The technique described above works well, but to keep the conversation context from outgrowing the model's context window, you may need a strategy, for example: summarize older parts of the conversation history.
Inspiration: https://github.com/deep-diver/Alpaca-LoRA-Serve#context-management