Salmon Coder
If you're willing to manually retype the conversation history, then you can get your question answered, like so: 
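(A sketch of what that manual approach might look like, assuming you rebuild the full prompt yourself before every request; the turn names, the example contents, and the small C++ helper below are all illustrative, not the actual chat.cpp code.)

```cpp
#include <iostream>
#include <string>
#include <utility>
#include <vector>

int main() {
    // Earlier turns, retyped by hand (illustrative content only).
    std::vector<std::pair<std::string, std::string>> history = {
        {"User",      "My name is Alice."},
        {"Assistant", "Nice to meet you, Alice."},
    };
    std::string new_question = "What is my name?";

    // Rebuild the whole transcript in front of the new question, so the
    // model sees every earlier exchange again on this request.
    std::string prompt;
    for (const auto& turn : history) {
        prompt += turn.first + ": " + turn.second + "\n";
    }
    prompt += "User: " + new_question + "\nAssistant:";

    // This combined string is what you would feed in as the prompt;
    // here we just print it to show its shape.
    std::cout << prompt << "\n";
    return 0;
}
```

The point is simply that every earlier exchange gets pasted back in front of the new question, so the model effectively re-reads the whole conversation on each request rather than relying on any memory of its own.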
After playing around with it some more, I'm somewhat more confused -- but I no longer think that the model doesn't have 'conversational memory'. Also, the chat.cpp file is identical...
I am working on a version that more explicitly conveys to Llama that this is a single-threaded conversation and that its job is only to respond to the user...
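Purely as an illustration of the kind of wording that might make that explicit (not the actual version being worked on), a preamble could read something like:

```
Below is a single, continuous conversation between a human user and an AI
assistant. The assistant replies only to the user's most recent message and
never writes lines on the user's behalf.

User: <question>
Assistant:
```

The preamble plus the "User:"/"Assistant:" framing is meant to discourage the model from continuing both sides of the dialogue itself.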