FastChat
fix: get_prompt goes wrong for Llama-2
Why are these changes needed?
When appending the first user role with a message of `None` and then building the conversation prompt like this:

```python
from fastchat.conversation import get_conv_template

conv_template = get_conv_template("llama-2")
conv_template.append_message(conv_template.roles[0], None)
conv_template.get_prompt()
```

the output is `"[INST] [INST]"`: the opening `[INST]` tag is duplicated. This is a small hidden bug in `get_prompt` for the Llama-2 template, and this PR fixes it.
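For context, here is a minimal, standalone sketch of the kind of fix involved. The real logic lives in `Conversation.get_prompt()` in `fastchat/conversation.py`; the function name `llama2_prompt` and the default `roles`/`seps` values below are illustrative assumptions, not the exact upstream code:

```python
# Standalone sketch of the fixed Llama-2 formatting loop (illustrative only).
def llama2_prompt(messages, roles=("[INST]", "[/INST]"), seps=(" ", " </s><s>")):
    ret = "[INST] "  # the opening tag is emitted once, up front
    for i, (role, message) in enumerate(messages):
        tag = roles[i % 2]
        if message:
            if i == 0:
                ret += message + " "
            else:
                ret += tag + " " + message + seps[i % 2]
        else:
            # Before the fix, this branch appended the tag unconditionally,
            # so a first message of None yielded "[INST] [INST]".
            if i != 0:
                ret += tag
    return ret

print(repr(llama2_prompt([("[INST]", None)])))  # '[INST] ' — no duplicated tag
```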
Related issue number (if applicable)
Other open-source repositories use this function to locate specific slices of the conversation prompt, so this bug can trigger subtle downstream errors or force hard-coded workarounds; a sketch of that pattern follows.
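To illustrate (a hypothetical downstream pattern, not code from any particular repository): such projects often bracket a message's span by comparing prompt lengths before and after appending it, so a spurious tag shifts every later slice:

```python
from fastchat.conversation import get_conv_template

conv = get_conv_template("llama-2")
before = len(conv.get_prompt())
conv.append_message(conv.roles[0], None)  # placeholder, message filled in later
after = len(conv.get_prompt())
# (before, after) is meant to bracket where the user message will go;
# a duplicated "[INST]" inflates `after` and shifts every slice computed
# from this point onward.
```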
Checks
- [x] I've run `format.sh` to lint the changes in this PR.
- [x] I've included any doc changes needed.
- [x] I've made sure the relevant tests are passing (if applicable).