Problem with Phi-3.5-mini-instruct chat template and endless generation
Bug Report
I want to use the new model Phi-3.5-mini-instruct, so I downloaded the file Phi-3.5-mini-instruct-Q5_K_M.gguf from https://huggingface.co/bartowski/Phi-3.5-mini-instruct-GGUF
The following prompt format is stated there:
<|system|>
{system_prompt}<|end|>
<|user|>
{prompt}<|end|>
<|assistant|>
Therefore I used that information in the settings. As a result, I have the following section in the file GPT4All.ini:
[model-Phi-3.5-mini-instruct-Q5_K_M.gguf]
filename=Phi-3.5-mini-instruct-Q5_K_M.gguf
name=Phi-3.5-mini-instruct
promptTemplate=<|user|>\n%1<|end|>\n<|assistant|>\n
systemPrompt=<|system|>\nYou are a helpful assistant.<|end|>\n
But when I use the model, sometimes after the answer a new question is automatically generated and answered. I suppose the reason for this has to do with the prompt template or with how the prompt template is processed.
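For reference, if I understand the settings correctly, the full prompt sent to the model for a single question should then look roughly like this (the question itself is just a placeholder):

<|system|>
You are a helpful assistant.<|end|>
<|user|>
What is the capital of France?<|end|>
<|assistant|>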
Can I modify the prompt template so that this model works correctly (and do the same for other models I download from Hugging Face)?
There seems to be information about the prompt template in the GGUF metadata. Would it be possible for GPT4All to use this information automatically?
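For example, the embedded template can be inspected with the gguf Python package. This is only a rough sketch to show that the metadata entry exists; the exact way the string value is extracted may differ between package versions, and it is not how GPT4All itself reads the file:

# Rough sketch (pip install gguf): print the chat template stored in the GGUF metadata.
from gguf import GGUFReader

reader = GGUFReader("Phi-3.5-mini-instruct-Q5_K_M.gguf")
field = reader.fields.get("tokenizer.chat_template")
if field is not None:
    # For string-valued keys, the data indices point at the part holding the raw value bytes.
    template = field.parts[field.data[0]].tobytes().decode("utf-8")
    print(template)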
Steps to Reproduce
- Download the model stated above
- Add the lines cited above to the file GPT4All.ini
- Start GPT4All and load the model Phi-3.5-mini-instruct
- Ask a simple question (maybe several times)
Expected Behavior
Only the user's questions should be answered; no new question or task should be generated by the model.
Your Environment
- GPT4All version: 3.2.1
- Operating System: Linux
- Chat model used (if applicable): Phi-3.5-mini-instruct