
[ SUGGESTION ] Indicate which features are not available with chat models

Open · ibehnam opened this issue 2 years ago • 1 comment

I miss the old days when OpenAI's main models were completion models. But their current flagship models (gpt-3.5-turbo and gpt-4) are chat-only, and that changes everything for LM programming frameworks such as LMQL and guidance.

On the website and in the documentation examples, I see a lot of really cool stuff that can be done with LMQL, and I've contemplated using it in my next major project. But it would be helpful to indicate which features require completion models, i.e., the parts of LMQL that don't apply to chat models. That way, my team and I would have realistic expectations about what LMQL can achieve and its added value compared to the plain API calls we normally make.

ibehnam avatar Sep 12 '23 05:09 ibehnam

Thanks for the suggestion. We will try to add more information to the docs. In general, simple constraints like `len(TOKENS(..))`, `STOPS_AT` and `STOPS_BEFORE` are possible with ChatGPT. More advanced things like regex constraints or grammar-constrained `[VAR]` decoding are currently not possible, although we have started working on better support for this. The major roadblock here is still that chat models do not allow continuing a partial assistant response with a different `logit_bias`.
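
For reference, a minimal sketch of a query that should work against a chat model, assuming the classic `argmax ... from ... where` syntax and the `chatgpt` model alias (adjust to your installed LMQL version; the prompt and variable names here are just illustrative):

```lmql
# chat-model query using only constraints that currently work with ChatGPT:
# a token-length limit and a stopping condition (no regex or grammar on [SUMMARY])
argmax
    "{:system} You are a concise assistant."
    "{:user} Summarize the benefits of constrained decoding in one sentence."
    "{:assistant} [SUMMARY]"
from
    "chatgpt"
where
    len(TOKENS(SUMMARY)) < 60 and STOPS_AT(SUMMARY, "\n")
```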

lbeurerkellner avatar Sep 14 '23 17:09 lbeurerkellner