Prompt template for WizardLM-2-8x22B?
What is the correct prompt template for WizardLM-2-8x22B in `.env.local`?
When setting it to the default one:

```
<s>{{#each messages}}{{#ifUser}}[INST] {{#if @first}}{{#if @root.preprompt}}{{@root.preprompt}}\n{{/if}}{{/if}}{{content}} [/INST]{{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}
```

the generated output is very odd and incoherent.
When setting the prompt template to the one shown on the model card:

```
{system_prompt} USER: {prompt} ASSISTANT: </s>
```

the output gets even worse.
Can anyone help?
Hi! We recommend using the tokenizer to fetch the chat prompt template. Remove `chatPromptTemplate` from your config and set `"tokenizer": "alpindale/WizardLM-2-8x22B"` in your model config. That should hopefully work.
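For reference, a minimal model entry in the `MODELS` array of `.env.local` could look like the sketch below. The `name` and the endpoint URL are placeholders for illustration; adjust them to your own setup:

```json
{
  "name": "WizardLM-2-8x22B",
  "tokenizer": "alpindale/WizardLM-2-8x22B",
  "endpoints": [{ "type": "tgi", "url": "http://127.0.0.1:8080" }]
}
```

With `tokenizer` set, chat-ui fetches the chat template from the tokenizer config, so no `chatPromptTemplate` entry is needed.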
@nsarrazin Thanks for this solution, but unfortunately it isn't optimal for me, because then I have to enter my `HF_TOKEN` and ChatUI needs to connect to the internet, which is not ideal for my privacy requirements.
Is there a way to do this fully offline?
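One offline option may be to keep `chatPromptTemplate` and write the model's format out by hand instead of fetching it. The model card suggests a Vicuna-style `USER:`/`ASSISTANT:` format, so an untested sketch (not a confirmed working config) would be:

```
"chatPromptTemplate": "{{#if @root.preprompt}}{{@root.preprompt}} {{/if}}{{#each messages}}{{#ifUser}}USER: {{content}} ASSISTANT: {{/ifUser}}{{#ifAssistant}}{{content}}</s>{{/ifAssistant}}{{/each}}"
```

This avoids any network access, since the template is defined entirely in your local config.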
I find the Zephyr template suits it well:

```
<|user|>
{User}
<|assistant|>
{Assistant}
```
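If the Zephyr format above works for this model, it can be written directly as a `chatPromptTemplate` so it runs fully offline. This is an untested sketch of that format in chat-ui's Handlebars syntax, assuming `</s>` as the end-of-turn token:

```
"chatPromptTemplate": "{{#each messages}}{{#ifUser}}<|user|>\n{{content}}</s>\n<|assistant|>\n{{/ifUser}}{{#ifAssistant}}{{content}}</s>\n{{/ifAssistant}}{{/each}}"
```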