llama-cpp-python
Actually create a random seed when using seed = -1 on load
Set a random initial seed when using -1 as the seed argument, as stated in the API reference:
seed (int, default: LLAMA_DEFAULT_SEED) – RNG seed, -1 for random
At the moment no random seed is created, so when not using a fixed seed, the model's first reply will always be the same, as will the chain of consecutive replies.
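A minimal sketch of the intended behavior (the function name `resolve_seed` and the sentinel value are illustrative, not the actual patch):

```python
import random

# llama.cpp uses 0xFFFFFFFF as its "pick a random seed" sentinel
LLAMA_DEFAULT_SEED = 0xFFFFFFFF

def resolve_seed(seed: int) -> int:
    """Map the user-facing seed argument to a concrete RNG seed.

    seed == -1 (or the default sentinel) requests a fresh random seed;
    any other value is used verbatim so runs stay reproducible.
    """
    if seed == -1 or seed == LLAMA_DEFAULT_SEED:
        # Actually draw a random seed instead of silently falling back to
        # a fixed one, so consecutive loads produce different reply chains.
        return random.randint(0, 0x7FFFFFFF)
    return seed

# Usage: resolve the seed once on load, then hand it to the sampler.
print(resolve_seed(-1))    # different value on each call
print(resolve_seed(1234))  # always 1234
```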
Fixes issue: https://github.com/abetlen/llama-cpp-python/issues/1809