llamafile
instruct chat templates
Prerequisites
- [X] I am running the latest code. Mention the version if possible as well.
- [X] I carefully followed the README.md.
- [X] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
- [X] I reviewed the Discussions, and have a new and useful enhancement to share.
Feature Description
It would be nice to directly support a number of chat templates (besides Alpaca instruct).
Motivation
This would facilitate, for example, easy API usage and inclusion in pipelines that invoke OpenAI-compatible inference servers, as in the example below.
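For context, this is roughly what such a pipeline step looks like against a local OpenAI-compatible endpoint. It is only an illustration: the base URL, port, API key, and model name are placeholders I am assuming here, not values taken from this issue.

```python
# Illustrative pipeline step talking to a local llamafile server through an
# OpenAI-compatible endpoint. URL, port, key, and model name are assumptions.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8080/v1",  # assumed local server address
    api_key="sk-no-key-required",         # placeholder; a local server typically ignores it
)

response = client.chat.completions.create(
    model="local-model",  # hypothetical model identifier
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize this log file in two sentences."},
    ],
)
print(response.choices[0].message.content)
```

The point is that clients like this already speak the chat-completions format, so server-side chat template support is what makes them work out of the box with instruct models.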
Possible Implementation
This could be done the way llama.cpp does it from #5538 onward; that code could probably be merged in largely as-is and wrapped. A rough illustration of the idea follows.
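As a minimal sketch of what per-template prompt construction involves, here is a ChatML-style formatter. The function name and structure are my own for illustration; this is not llama.cpp's actual implementation, which dispatches over many templates (and could pick up the chat template metadata embedded in GGUF files).

```python
# Sketch only: render OpenAI-style messages into a ChatML prompt string.
# Real support would select the formatter from the model's chat template.
from typing import Dict, List


def apply_chatml_template(messages: List[Dict[str, str]],
                          add_generation_prompt: bool = True) -> str:
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Leave the assistant turn open so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)


prompt = apply_chatml_template([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
print(prompt)
```

Wrapping something like this behind the chat endpoint, keyed on the model's declared template, is essentially what is being requested.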