
Ability to take in a config file as initial prompt

Open MLTQ opened this issue 1 year ago • 4 comments

Following on from the "Store preprocessed prompts" issue, it would be good to be able to take in a text file with a generic prompt and flags to start a chatbot or similar. Such a config file could be YAML or TOML and include flags for running, model locations, prompt locations, etc.
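For illustration, such a config might look something like this (all keys and the filename are hypothetical; this is not an existing llama.cpp feature):

```toml
# llama.toml — sketch of the proposed config file
model = "models/7B/ggml-model-q4_0.bin"
threads = 8
n_predict = 256
temp = 0.8
prompt_file = "prompts/chatbot.txt"
```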

MLTQ avatar Mar 14 '23 20:03 MLTQ

On Windows I just use CMD files. I can just drop a prompt file on the CMD script, or use the terminal. Here is an example:

@REM explicitly state all possible parameters:
if NOT [%1]==[] llama.exe ^
	--model			"llama-30B\ggml-model-q4_0.bin" ^
	--seed			1973	^
	--threads		15	^
	--n_predict		1024	^
	--repeat_last_n		64	^
	--batch_size		8	^
	--repeat_penalty	1.4	^
	--top_k			40	^
	--top_p			0.9	^
	--temp			0.8	^
	--file %1 2>>llama.stderr.txt

Then I have another one set up for interactive mode - "User:" style prompt files, etc.

bitRAKE avatar Mar 15 '23 00:03 bitRAKE

In bash, you can just load a prompt directly into the string:

./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"

nschulzke avatar Mar 15 '23 01:03 nschulzke

Thanks @nschulzke, I have been doing this, but ideally I would like to just run:

./main

and have it look for a config file and, if it finds one, use it.

MLTQ avatar Mar 15 '23 04:03 MLTQ

@nschulzke

In bash, you can just load a prompt directly into the string:

./main -m ./models/65B/ggml-model-q4_0.bin -t 8 -n 256 -p "$(<FILE_NAME_HERE)"

You could also just use -f/--file.

But for @MLTQ, I think using a driver script is really the way to go, especially if you want to use a configuration language like YAML. I wrote a Python script to do exactly that. You're welcome to modify it to suit your purposes.

saites avatar Mar 18 '23 20:03 saites