code-llama-for-vscode
code-llama-for-vscode copied to clipboard
Error: special tags are not allowed as part of the prompt.
It will return Error: special tags are not allowed as part of the prompt.
as an error. No settings adjusted, completely fresh instance of vscode and the continue extension
I'm seeing the same thing. Relevant thread on reddit: https://old.reddit.com/r/MachineLearning/comments/167amdt/dwhy_are_special_tokens_not_allowed_in_the_prompt/
I don't see any special tags in instructions
though. I'm trying to have it complete
# load csv and plot results
def load_and_plot(csv_path):
And the contents of instructions
is this
[[{'role': 'user', 'content': '[INST] Code in this file is highlighted (/home/user/codellama/local_test.py):\n```\n# load csv and plot results\ndef load_and_plot(csv_path):\n\n``` [/INST][INST] # load csv and plot results\ndef load_and_plot(csv_path):\n [/INST]'}]]
they are generated by continue. it turned out you have to replace this line
https://github.com/facebookresearch/codellama/blob/9cf7caef699f61386ab64bebd0fa53a277478e51/llama/generation.py#L54
with an empty array. apparently this is to fight back against instruction injection, not entirely in the topic of this yet, why they would have added to an instruction type model though.
how fix this
read my answer before
I think what happened here is Continue changed how they passed the prompt which broke my code. I've updated the code to the latest version of Continue so this shouldn't be a problem anymore.
Fyi special tags here are strings that should not be passed by the user or else the user could break or disrupt the usage of the LLM. Input to an LLM is ultimately just a string with these special tags, so if a user passes a special tag, it could confuse the LLM.
I'll go ahead and close this issue as I believe it's been resolved.