llama.cpp
Ability to take in a config file as initial prompt
Following on from the "Store preprocessed prompts" issue, it would be good to be able to take in a text file with a generic prompt and flags to start a chatbot or similar. Such a config file could be YAML or TOML and include flags for running, model locations, prompt locations, etc.
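For concreteness, such a config might look something like this (a purely hypothetical sketch; these keys are not a format llama.cpp currently reads):

```yaml
# chatbot.yaml -- hypothetical config, not an existing llama.cpp format
model: ./models/65B/ggml-model-q4_0.bin
prompt_file: ./prompts/chat-with-bob.txt
threads: 8
n_predict: 256
temp: 0.8
top_k: 40
top_p: 0.9
repeat_penalty: 1.4
interactive: true
```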
On Windows I just use CMD files. I can just drop a prompt file on the CMD script, or use the terminal. Here is an example:
```bat
@REM explicitly state all possible parameters:
if NOT [%1]==[] llama.exe ^
--model "llama-30B\ggml-model-q4_0.bin" ^
--seed 1973 ^
--threads 15 ^
--n_predict 1024 ^
--repeat_last_n 64 ^
--batch_size 8 ^
--repeat_penalty 1.4 ^
--top_k 40 ^
--top_p 0.9 ^
--temp 0.8 ^
--file "%~1" 2>>llama.stderr.txt
```

(`%~1` strips any surrounding quotes from the dropped path before re-quoting it, so paths with spaces work whether or not Explorer quoted them.)
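For example, from a terminal (assuming the script above is saved as run-llama.cmd; dropping a prompt file onto the script in Explorer works the same way):

```bat
run-llama.cmd prompts\chat-with-bob.txt
```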
Then I have another one set up for interactive mode, "User:"-style prompt files, etc.
In bash, you can just load a prompt directly into the string:
```sh
./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"
```
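(`$(<file)` is bash's built-in shorthand for `$(cat file)`: it expands to the file's contents, so the whole prompt is passed as the `-p` argument.)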
Thanks @nschulzke, I have been doing this, but I would ideally want to just run:

```sh
./main
```

and have it look for a config file and, if it finds one, use it.
@nschulzke:

> In bash, you can just load a prompt directly into the string:
>
> ```sh
> ./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"
> ```
You could also just use `-f`/`--file`.
But for @MLTQ, I think using a driver script is really the way to go, especially if you want to use a configuration language like YAML. I wrote a Python script to do exactly that. You're welcome to modify it to suit your purposes.
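A minimal sketch of what such a driver can look like, assuming the hypothetical chatbot.yaml layout from the top of the thread and a ./main binary in the working directory (the key names, defaults, and driver.py filename are illustrative, and this is not the script referenced above):

```python
#!/usr/bin/env python3
"""Minimal driver: read a YAML config and invoke llama.cpp's ./main accordingly.

Illustrative sketch only; the config keys mirror the hypothetical
chatbot.yaml shown earlier in this thread.
"""
import subprocess
import sys

import yaml  # pip install pyyaml

# Map config keys to llama.cpp command-line flags (early-2023 main flags).
FLAGS = {
    "model": "--model",
    "prompt_file": "--file",
    "threads": "--threads",
    "n_predict": "--n_predict",
    "temp": "--temp",
    "top_k": "--top_k",
    "top_p": "--top_p",
    "repeat_penalty": "--repeat_penalty",
    "seed": "--seed",
}


def build_command(config: dict) -> list[str]:
    """Translate the config dict into a ./main argument list."""
    cmd = ["./main"]
    for key, flag in FLAGS.items():
        if key in config:
            cmd += [flag, str(config[key])]
    if config.get("interactive"):
        cmd.append("--interactive")
    return cmd


def main() -> None:
    # Default to chatbot.yaml in the current directory, which gives the
    # "./main with no arguments" behaviour @MLTQ asked for; an explicit
    # config path can be passed as the first argument.
    path = sys.argv[1] if len(sys.argv) > 1 else "chatbot.yaml"
    with open(path) as f:
        config = yaml.safe_load(f) or {}
    subprocess.run(build_command(config), check=True)


if __name__ == "__main__":
    main()
```

Run `python3 driver.py` next to a chatbot.yaml, or pass another config path as the first argument.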