Salmon Coder

3 comments by Salmon Coder

If you're willing to manually retype the conversation history, you can get your question answered, like so: ![Screenshot from 2023-03-18 11-11-05](https://user-images.githubusercontent.com/112276931/226118469-17c42559-c9e5-4e33-aed3-0ebaa0a25a1f.png)
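A minimal sketch of that manual-retyping workaround, assuming the simplest possible approach: prepend every prior turn to the new question so the model receives the full history in one prompt. All names here are illustrative, not taken from chat.cpp.

```cpp
#include <string>
#include <utility>
#include <vector>

// Hypothetical helper: flatten prior (user, assistant) turns plus the new
// question into one prompt string, so the model "remembers" the conversation
// by simply re-reading it on every call.
std::string build_prompt(
    const std::vector<std::pair<std::string, std::string>>& turns,
    const std::string& new_question) {
    std::string prompt;
    for (const auto& turn : turns) {
        prompt += "User: " + turn.first + "\n";
        prompt += "Assistant: " + turn.second + "\n";
    }
    // End with an open "Assistant:" so the model completes the next reply.
    prompt += "User: " + new_question + "\nAssistant:";
    return prompt;
}
```

This is exactly what retyping by hand does, just automated: the "memory" lives in the prompt text, not in the model.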

After playing around with it some more, I'm somewhat more confused, but I no longer think the model lacks 'conversational memory'. Also, the chat.cpp file is identical...

I am working on a version that more explicitly conveys to Llama that this is a single-threaded conversation and that its only job is to respond to the user....
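One way such a version might convey that framing is a fixed preamble placed before the turn history, stating the conversation structure and the model's role. This is a hypothetical sketch of that idea, not the actual chat.cpp change:

```cpp
#include <string>

// Hypothetical preamble builder: states up front that there is one user,
// one assistant, and that the assistant only answers the latest message
// (and never writes the user's lines itself).
std::string make_preamble(const std::string& user_name) {
    return "Below is a single-threaded conversation between " + user_name +
           " and an assistant. The assistant replies only to " + user_name +
           "'s most recent message and never speaks on " + user_name +
           "'s behalf.\n\n";
}
```

The preamble would then be concatenated with the accumulated turn history before each generation call.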