Classifier-Free Guidance support
From yesterday's discussion, here are some extra features to add:
- CFG support
- Mirostat sampling
- exl2 quantization support
These would help broaden support for newer methods of chatting with models.
Thanks for the request. As we discussed in private, mirostat support is on the way. After that, we'll focus on a few more deterministic samplers, and then we can finally move on to CFG and grammars.
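For anyone following along, the core of mirostat (v2) is a small feedback loop: truncate tokens whose "surprise" exceeds a running threshold `mu`, sample from what's left, then nudge `mu` toward the target surprise `tau`. A minimal numpy sketch — this is illustrative only, not our actual implementation, and the function name and defaults are made up:

```python
import numpy as np

def mirostat_v2_sample(logits, mu, tau=5.0, eta=0.1, rng=None):
    """One mirostat v2 step (sketch): drop tokens more surprising than mu,
    sample from the remainder, then steer mu toward the target surprise tau."""
    rng = rng or np.random.default_rng()
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    surprise = -np.log2(probs)            # per-token surprise, in bits
    keep = surprise <= mu                 # truncate overly surprising tokens
    if not keep.any():
        keep[np.argmax(probs)] = True     # always keep the most likely token
    trunc = np.where(keep, probs, 0.0)
    trunc /= trunc.sum()
    token = rng.choice(len(probs), p=trunc)
    mu -= eta * (surprise[token] - tau)   # feedback toward the target surprise
    return token, mu
```

You'd carry `mu` across decoding steps (typically initialized to `2 * tau`), which is what lets it adapt to the model's entropy as generation goes on.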
I haven't experimented with the exllamav2 kernels yet, but I assume it'll be straightforward enough to migrate from v1 to v2.
At the moment, we've added support for:
- mirostat
- exllamav2 (though not the variable bitrates, GPTQ only)
CFG support is not planned and will likely not happen in the foreseeable future, since it requires a second forward pass per token (for the negative prompt) and would slow down generation.
Re-opening this issue so we can keep track of CFG support. After discussing internally, we decided to add it (though not as a high-priority addition). We'll likely need to separate CFG requests into their own batches so the throughput cost doesn't affect regular requests.
Just saying that, as an individual user, having CFG support with an 8-bit KV cache is obviously really useful. exl2 does this too and it's killer; the net gain in quality feels like cheating. I wonder if there'd be room to further squeeze the second cache, given that it's useful even when you just pass an empty string (or `<s>`) as the prompt? I don't suppose the maths of CFG still makes sense with a shorter negative context? ...maybe just kludge it with "I pinky promise that we found a sink token and not a null reference down there in the context, mr transformer, don't worry about it, just keep up that forward() okay buddy!"
I feel it might not even be a bad thing if the negative guidance was, y'know, low quality? It's gotta stay on topic, but /shrug
Not like I'd know, so I'm grateful that you guys do. Keep it up and thanks for your work.