Default Stop Sequence is not working when user provides additional stop sequences
What is the issue?
I want generation to stop at the stop sequence "\nObservation".
My expectation is that generation will stop at either "\nObservation" or the model's default stop sequence, "<|eot_id|>" in the llama3 case.
But that does not happen, and the model keeps generating the answer.
Request:
Response:
I think user-provided stop sequences should be appended to the default stop sequences instead of replacing them.
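The proposed merge behavior can be sketched roughly as follows. This is a hypothetical helper for illustration only, not Ollama's actual implementation; the function name `merge_stop_sequences` and its signature are my own:

```python
def merge_stop_sequences(default_stops, user_stops):
    """Append user-supplied stop sequences to the model's defaults,
    skipping duplicates, instead of overwriting the defaults."""
    merged = list(default_stops)
    for stop in user_stops:
        if stop not in merged:
            merged.append(stop)
    return merged

# llama3's default stop token plus a user-provided ReAct-style stop:
print(merge_stop_sequences(["<|eot_id|>"], ["\nObservation"]))
# → ['<|eot_id|>', '\nObservation']
```

With this behavior, generation would halt on whichever sequence appears first, so the model's end-of-turn token still terminates output even when the user adds their own stops.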
OS
Windows
GPU
No response
CPU
Intel
Ollama version
0.1.27