llama.cpp
Added script to invoke alpaca model
Resolves https://github.com/ggerganov/llama.cpp/issues/240
WIP
This needs to be able to:
- Configure custom model folders.
- Adjust settings for running variants of the Alpaca model, with corresponding changes on the C++ side.
- Provide the ability to configure parameters.
- Possibly hide the extraneous template it returns.
- Run in interactive mode.
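As a rough illustration of the "configure custom model folders / parameters" items above, a wrapper script might just assemble the `llama.cpp` command line from user-supplied options. This is only a sketch: the binary name, model path, and which flags the C++ side accepts (`-m`, `-f`, `-i` here) are assumptions, not part of this PR.

```python
import argparse
import subprocess

def build_command(binary, model, prompt_file=None, interactive=True, extra_args=None):
    # Assemble the llama.cpp invocation; flag names are assumed to
    # mirror the main example's CLI (-m model, -f prompt file, -i interactive).
    cmd = [binary, "-m", model]
    if prompt_file:
        cmd += ["-f", prompt_file]
    if interactive:
        cmd.append("-i")
    if extra_args:
        cmd += list(extra_args)
    return cmd

if __name__ == "__main__":
    # Paths below are placeholders for a user-configured model folder.
    parser = argparse.ArgumentParser(description="Invoke an Alpaca model via llama.cpp")
    parser.add_argument("--binary", default="./main")
    parser.add_argument("--model", default="./models/ggml-alpaca-7b-q4.bin")
    parser.add_argument("--prompt-file", default=None)
    opts = parser.parse_args()
    subprocess.run(build_command(opts.binary, opts.model, opts.prompt_file))
```

Keeping command assembly in a separate function makes it easy to add per-variant parameter presets later without touching the invocation logic.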
I already made some changes on master to support the interactive mode, where we inject a `### Instruction:` prefix and a `### Response:` suffix into the input. I don't know if this is really needed, but this is what the alpaca.cpp guys are using, so I added it.
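For clarity, the prefix/suffix injection described above amounts to wrapping each user turn in the Alpaca instruction template. A minimal sketch (the exact newline placement is an assumption; the comment above only specifies the prefix and suffix strings):

```python
def alpaca_prompt(instruction):
    # Wrap user input in the Alpaca-style template: "### Instruction:"
    # before the text and "### Response:" after it, so the model
    # continues with the answer. Newline layout is an assumption.
    return f"### Instruction:\n{instruction}\n### Response:\n"
```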
I think our original `--interactive` mode actually works better, but I don't have much evidence for that.
If you adapt this Python script to the new changes and update the README with usage instructions, we can merge.
I think this got a little outdated.
If this is still of interest, please move the script to the examples folder, add some brief usage instructions to the README file, and reopen.