HanishKVC
Please have a look at the code in the PR below. Around the time llama3 came out, I had a need to look at llama.cpp and in turn I...
Adding the attached patch to this PR also allows me to chat with llama3 using `main -i --chaton llama3` [llamacpp-llama3-exp-v1.patch](https://github.com/ggerganov/llama.cpp/files/15049481/llamacpp-llama3-exp-v1.patch)
> This sounds like an excellent and much needed addition to main. Did you add a flag for specifying the system role's message? In interactive mode (i.e. `-i`) any prompt...
There is a new PR, which is again an experiment that tries to use a simple-minded JSON file to drive the logic, so that many aspects can...
Hi @khimaros, This patch auto-sets the example/main's in-prefix/suffix as well as the antiprompt/reverse-prompt from the equivalent configuration data in the specified chaton_meta.json file; that is the reason why it's no...
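To make the idea concrete, a per-template metadata file of this kind might look roughly as below. This is only a hypothetical sketch; the key names (`prefix`, `suffix`, `reverse-prompt`, the role sub-objects) and the llama3 tag strings are assumptions for illustration, not the actual schema used by the PR's chaton_meta.json:

```json
{
  "llama3": {
    "user": {
      "prefix": "<|start_header_id|>user<|end_header_id|>\n\n",
      "suffix": "<|eot_id|>"
    },
    "assistant": {
      "prefix": "<|start_header_id|>assistant<|end_header_id|>\n\n"
    },
    "reverse-prompt": "<|eot_id|>"
  }
}
```

With something like this, example/main's in-prefix, in-suffix and reverse-prompt can all be derived from one place per chat template, rather than being passed individually on the command line.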
Hi @ggerganov @ngxson @teleprint-me @khimaros @mofosyne The initial/previous version was built around a JSON object, while the new version is built around a MapOfMapOfVariant (GroupKV), which could be preloaded with...
Hi @ggerganov @ngxson @mofosyne Just to give some rough context: for code that uses the existing chat template logic, like examples/server, a simple change like the one below will allow it to...