llama.cpp
llama: introduce support for model-embedded sampling parameters
ref: #17088
This PR adds support for setting sampler parameters from GGUF KV metadata, allowing model creators to embed recommended sampler settings that are applied unless explicitly overridden via CLI flags.
Handy for users who do not want to tinker with sampler settings but still want the recommended values applied.
Priority of Sampling Parameters
1. User flags (e.g., setting --temp 0.6)
2. Model-embedded recommendation (e.g., general.sampling.temp = 0.6)
3. Default hardcoded values in common_params_sampling
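A minimal sketch of this precedence, in Python for illustration only (the actual resolution happens in llama.cpp's C++ common code; the function and variable names below are made up):
# Illustration of the proposed precedence; not the actual llama.cpp implementation.
def resolve_sampling_param(cli_value, model_value, default_value):
    # Pick a sampling parameter: CLI flag > model-embedded value > hardcoded default.
    if cli_value is not None:      # user passed e.g. --temp 0.6 explicitly
        return cli_value
    if model_value is not None:    # GGUF carries e.g. general.sampling.temp
        return model_value
    return default_value           # fall back to the common_params_sampling default

# No CLI flag given, model embeds temp = 0.6, hardcoded default would be 0.8:
print(resolve_sampling_param(None, 0.6, 0.8))  # -> 0.6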
Introduced Metadata
- general.sampling.sequence
- general.sampling.top_k
- general.sampling.top_p
- general.sampling.min_p
- general.sampling.xtc_probability
- general.sampling.xtc_threshold
- general.sampling.temp
- general.sampling.penalty_last_n
- general.sampling.penalty_repeat
- general.sampling.mirostat
- general.sampling.mirostat_tau
- general.sampling.mirostat_eta
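For reference, a minimal sketch of how such keys could be written with gguf-py's GGUFWriter (the output path, architecture, and values are placeholders; a real model file would of course also carry tensors):
from gguf import GGUFWriter

# Metadata-only illustration: placeholder file name, architecture, and values.
writer = GGUFWriter("model-with-sampling.gguf", arch="llama")
writer.add_float32("general.sampling.temp", 0.6)
writer.add_uint32("general.sampling.top_k", 40)
writer.add_float32("general.sampling.top_p", 0.95)

writer.write_header_to_file()
writer.write_kv_data_to_file()
writer.close()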
Please let me know if we should introduce more sampling parameters.
Embedding From Safetensors into GGUF
By default, the conversion script will attempt to find generation_config.json in the model directory and automatically add the recommended sampler parameters to the GGUF metadata. If a sampling parameter is not available in that file, users can also specify --metadata metadata.json.
Note that --metadata metadata.json takes precedence over generation_config.json and will overwrite values where duplicate keys are found.
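Roughly, the merge behaves like the following sketch (not the actual convert_hf_to_gguf.py code; the mapping from generation_config.json fields to GGUF keys shown here is illustrative):
import json
from pathlib import Path

# Illustrative mapping from generation_config.json fields to the proposed GGUF keys.
HF_TO_GGUF = {
    "temperature": "general.sampling.temp",
    "top_k": "general.sampling.top_k",
    "top_p": "general.sampling.top_p",
}

def collect_sampling_metadata(model_dir: Path, metadata_path: Path | None) -> dict:
    merged = {}
    gen_cfg = model_dir / "generation_config.json"
    if gen_cfg.is_file():
        cfg = json.loads(gen_cfg.read_text())
        for hf_key, gguf_key in HF_TO_GGUF.items():
            if hf_key in cfg:
                merged[gguf_key] = cfg[hf_key]
    if metadata_path is not None:
        # --metadata takes precedence: duplicate keys are overwritten here.
        merged.update(json.loads(metadata_path.read_text()))
    return merged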
$ cat > metadata.json << EOF
{
"general.sampling.temp": 0.6
}
EOF
$ python3 convert_hf_to_gguf.py --outfile deepseek-r1-distill-qwen-1.5b.gguf --metadata metadata.json deepseek-r1-distill-qwen-1.5b/
$ ./build/bin/llama-cli -m deepseek-r1-distill-qwen-1.5b.gguf -p "Write me a dog walking business idea 1. " -no-cnv -n 1 -t 10 2>&1 | grep "temp"
llama_model_loader: - kv 2: general.sampling.temp f32 = 0.600000
llama_model_loader: - kv 27: tokenizer.chat_template str = {% if not add_generation_prompt is de...
top_k = 40, top_p = 0.950, min_p = 0.050, xtc_probability = 0.000, xtc_threshold = 0.100, typical_p = 1.000, top_n_sigma = -1.000, temp = 0.600
sampler chain: logits -> logit-bias -> penalties -> dry -> top-n-sigma -> top-k -> typical -> top-p -> min-p -> xtc -> temp-ext -> dist
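The embedded value can also be checked without running inference, e.g. with gguf-py's GGUFReader (a sketch; the scalar-field access follows the pattern used by gguf-py's gguf_dump.py script):
from gguf import GGUFReader

reader = GGUFReader("deepseek-r1-distill-qwen-1.5b.gguf")
field = reader.fields.get("general.sampling.temp")
if field is not None:
    # For scalar KV fields the last part holds the value itself.
    print("general.sampling.temp =", float(field.parts[-1][0]))
else:
    print("general.sampling.temp not embedded in this GGUF")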
$ cat > metadata.json << EOF
{
"general.sampler.temp": 0.6
}
EOF
So, you're suggesting that parameters should be added manually before conversion? How likely is that to happen?
AFAIK most models come with recommended settings in generation_config.json (though some are likely just copy-pasted from somewhere), so perhaps it's a better idea to get them from there?
Edit: or is that automatically added to metadata?
You're right, I didn't spot that. I guess I'll have to rework the code so that it pulls generation_config.json from the model directory and maps it to general.sampler.*, and we can skip the --metadata flag.
I think the sampling sequence is important too. Also, I personally only really tend to use min-p and xtc (not in your proposal).
@Green-Sky Will include general.sampler.xtc_probability and general.sampler.xtc_threshold first, then --samplers SEQUENCE.
@CISC RE generation_config.json vs. the custom --metadata file: I've realised that generation_config.json does not actually cover non-standard parameters such as mirostat. In that case we'll still need --metadata metadata.json support for those parameters, unless there is a better way of handling this.
Does transformers even have this parameter?
Doesn't look like it. I followed some of Ollama's supported parameters: https://ollama.readthedocs.io/en/modelfile/#parameter
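For such non-standard parameters, a --metadata file along these lines would still be needed; a sketch with example values only (key names taken from the list in the description):
import json

# Sketch of a metadata.json covering parameters that generation_config.json
# does not carry; values are examples, not recommendations.
mirostat_metadata = {
    "general.sampling.mirostat": 2,
    "general.sampling.mirostat_tau": 5.0,
    "general.sampling.mirostat_eta": 0.1,
}
with open("metadata.json", "w") as f:
    json.dump(mirostat_metadata, f, indent=2)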
@CISC any update on this PR?
Thanks for the reminder. :)