[Feature Request] message_to_prompt -> .apply_chat_template
Required prerequisites
- [X] I have searched the Issue Tracker and Discussions that this hasn't already been reported. (+1 or comment there if it has.)
- [ ] Consider asking first in a Discussion.
Motivation
In the current support for open-source models we have a function called message_to_prompt
that converts the OpenAI user-assistant alternating message format into the corresponding prompt template for each model. I believe this is exactly what the .apply_chat_template
method does. For models that don't ship a built-in template, or that need special treatment, we could consider writing a dedicated function for those use cases.
ref:
- https://huggingface.co/docs/transformers/main/en/chat_templating
- https://github.com/camel-ai/camel/blob/d7e49241bcfe86b1e010912ab7d3b4ab3d993d7c/camel/utils/token_counting.py#L21
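
For illustration, here is a minimal sketch of what the replacement could look like, assuming a model whose tokenizer ships a `chat_template` (the model name below is just an example, not a camel default):

```python
# Minimal sketch (not the camel implementation): replacing a hand-written
# prompt builder with the tokenizer's own chat template.
from transformers import AutoTokenizer

# Any model whose tokenizer config ships a chat_template works here.
tokenizer = AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta")

# OpenAI-style user/assistant alternating messages.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hi, how can I help?"},
    {"role": "user", "content": "Summarize this issue."},
]

# tokenize=False returns the formatted prompt string instead of token ids;
# add_generation_prompt=True appends the template's assistant header so the
# model continues as the assistant.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```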
Solution
No response
Alternatives
No response
Additional context
No response
Emmm, OK, I see some inflexibilities in directly applying this method: it cannot be applied to a single message, only to a full chat history. During inference that is fine, since we always feed the whole history to the agents, but elsewhere, like where we need it in score_based.py, it just doesn't work. Maybe we should come up with a better solution that combines the tokenizer's shipped chat_template with some customization of our own, e.g. something like the sketch below.
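
A rough, hypothetical sketch of the hybrid approach: prefer the shipped chat_template when one exists, and fall back to our own formatting otherwise. The names here (`build_prompt`, `messages_to_prompt_fallback`) are placeholders, not existing camel API:

```python
# Hypothetical sketch: use the tokenizer's chat_template when available,
# otherwise fall back to a custom formatter of our own.
from typing import Dict, List

from transformers import PreTrainedTokenizer


def messages_to_prompt_fallback(messages: List[Dict[str, str]]) -> str:
    # Naive fallback for tokenizers without a built-in template; a real
    # implementation would dispatch on the model type.
    turns = "\n".join(f"{m['role']}: {m['content']}" for m in messages)
    return turns + "\nassistant:"


def build_prompt(
    tokenizer: PreTrainedTokenizer, messages: List[Dict[str, str]]
) -> str:
    # tokenizer.chat_template is set when the model ships a template.
    if getattr(tokenizer, "chat_template", None):
        return tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )
    return messages_to_prompt_fallback(messages)
```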