
Is there a way to make it remember conversation like ChatGPT?

Open gsgoldma opened this issue 1 year ago • 6 comments

Right now, every prompt seems to generate a brand new response with no memory of the previous conversation.

gsgoldma commented Mar 23 '23 21:03

You need to feed the history in with every prompt.

K0NTRA203 commented Mar 23 '23 21:03

I have the same question. How do you implement this technically? Through a separate uploaded file?

RichardTLabble commented Mar 24 '23 14:03

Alpaca will never be like ChatGPT.

MistFenix commented Mar 24 '23 21:03

@gsgoldma Unfortunately, I don't think there is a way. However, the upstream repo's interactive chat option does remember chat history; since they added Alpaca support, you can try using that.

vicfic18 commented Mar 25 '23 09:03

You need to feed the history in with every prompt.

I ask 'write a story about xxx' and get the response 'bla-bla about xxx'. Then I ask 'continue story: bla-bla about xxx' and I get the same result again: 'bla-bla about xxx'.

openMolNike commented Mar 26 '23 18:03

@RichardTLabble

You have to implement it yourself by defining and storing the conversation context in the memory of your Alpaca "wrapper". For example, using this technique:

You start with this prompt, which includes a "context prefix" and the user's first request:

Write a response that answers the request according to the conversation history below.

Request: Hello, my name is Samuel.
Response:

Alpaca returns:

Response: Hello, how can I assist you?

You continue with this prompt, which concatenates Alpaca's previous response with a new user request:

Write a response that answers the request according to the conversation history below.

Request: Hello, my name is Samuel.
Response: Hello, how can I assist you?

Request: What is my name?
Response:

Alpaca now answers using the context:

Response: Samuel

And so on...
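
Putting the steps above together, here is a minimal sketch of such a wrapper in Python. It only shows the history bookkeeping and prompt building; run_alpaca is a placeholder you would wire to however you actually invoke the model (for example the alpaca.cpp chat binary or a library binding), so treat the names here as assumptions, not an official API.

# Minimal sketch of a conversation "wrapper" that feeds the history back
# into every prompt. run_alpaca() is a placeholder for your own model call.

CONTEXT_PREFIX = (
    "Write a response that answers the request "
    "according to the conversation history below.\n"
)

history = []  # list of (request, response) pairs kept between turns


def build_prompt(new_request):
    """Concatenate the context prefix, past exchanges, and the new request."""
    parts = [CONTEXT_PREFIX]
    for request, response in history:
        parts.append("Request: " + request + "\nResponse: " + response + "\n")
    parts.append("Request: " + new_request + "\nResponse:")
    return "\n".join(parts)


def run_alpaca(prompt):
    """Placeholder: send the prompt to the model and return its completion."""
    raise NotImplementedError("call your alpaca.cpp binary or binding here")


def chat(new_request):
    prompt = build_prompt(new_request)
    response = run_alpaca(prompt).strip()
    history.append((new_request, response))  # remember this turn for next time
    return response

Calling chat("Hello, my name is Samuel.") and then chat("What is my name?") reproduces the two prompts shown above, because the first exchange is replayed inside the second prompt.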

The technique described above works well, but to keep the conversation context memory from growing too large, you may need a strategy such as summarizing parts of the conversation history; see the sketch below.
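
Summarizing requires an extra model call, so as a simpler illustration here is a hedged sketch of a sliding-window strategy that just drops the oldest turns. It assumes the history list from the sketch above; a model-generated summary could be substituted for the dropped turns instead.

MAX_TURNS = 6  # rough cap on how many past exchanges the prompt replays


def trim_history():
    """Keep only the most recent turns so the prompt stays within the context size.

    The oldest turns are simply discarded here; a richer strategy could replace
    them with a summary of the dropped part of the conversation.
    """
    del history[:-MAX_TURNS]

Calling trim_history() at the start of chat() keeps every prompt bounded.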

Inspiration: https://github.com/deep-diver/Alpaca-LoRA-Serve#context-management

SamuelTallet commented Mar 29 '23 20:03