Brandon T. Willard
> This is dead code which should be cleaned up after
>
> #1010

The relevant code now seems to be [here](https://github.com/dottxt-ai/outlines/blob/6035e86ac8089d4f8aeab07ea116093a2ed0e03e/outlines/processors/structured.py#L105); is that what will be updated in #1010?
#1192 does not appear to update the logits arrays in place, at least not without creating another array the size of the logits.
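For context, a hedged NumPy sketch (not the actual #1192 code) of the distinction: masking by building a second logits-sized array versus mutating the logits buffer in place with only a small scratch array:

```python
import numpy as np

# Deterministic example logits over a small vocabulary.
logits = np.array([0.5, 1.2, -0.3, 2.0, 0.1, -1.0, 0.7, 0.0], dtype=np.float32)
allowed = np.array([1, 3])  # hypothetical token ids the guide permits next

# Allocating approach: builds a second array the size of the logits.
masked = np.full_like(logits, -np.inf)
masked[allowed] = logits[allowed]

# In-place approach: only a len(allowed)-sized scratch array is created,
# and the original logits buffer itself is mutated.
kept = logits[allowed].copy()
logits.fill(-np.inf)
logits[allowed] = kept
```

Both paths yield the same masked distribution; the difference is whether an extra logits-sized allocation happens on every decoding step.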
Looks like this might need to be moved after https://github.com/dottxt-ai/outlines/issues/1206.
Can you provide a complete minimal working example of the `outlines` code you're using to get that result? In general, don't call generator constructors like `outlines.generate.choice` in a loop. Only...
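As a hedged illustration of the loop point (using a stand-in class, since the cost of the real `outlines.generate.choice` constructor is exactly what makes rebuilding it per iteration expensive):

```python
# Hypothetical stand-in for an expensive generator constructor such as
# outlines.generate.choice, which does compilation work on creation.
class ChoiceGenerator:
    n_built = 0  # track how many times the expensive build runs

    def __init__(self, choices):
        ChoiceGenerator.n_built += 1
        self.choices = choices  # real code would compile an index here

    def __call__(self, prompt):
        return self.choices[len(prompt) % len(self.choices)]


prompts = ["a", "bb", "ccc"]

# Anti-pattern: rebuilding the generator on every iteration.
bad = [ChoiceGenerator(["yes", "no"])(p) for p in prompts]

# Preferred: construct once, reuse across prompts.
gen = ChoiceGenerator(["yes", "no"])
good = [gen(p) for p in prompts]
```

The results are identical, but the first loop pays the construction cost three times instead of once.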
Converting this back into an issue because it does mostly describe a bug-like situation with the current implementation. Design and approach proposals should take place in the discussion, though.
See the code in the `parsing` module.
Can you provide the full output of `conda list`/`pip list`?
> * warning in CFGGuide while [these issues aren't resolved](https://github.com/outlines-dev/outlines/pull/1067#issuecomment-2251396332)

We need a more general and persistent warning that explains that the CFG implementation is (and has always been) experimental...
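A minimal sketch of what such a general warning could look like (the class and message here are hypothetical stand-ins, not the actual outlines code), emitted when the guide is constructed:

```python
import warnings


class ExperimentalWarning(UserWarning):
    """Category for features that carry no stability guarantees."""


class CFGGuide:  # hypothetical stand-in for outlines' CFGGuide
    def __init__(self, grammar):
        # Warn at construction time so every user of the CFG path sees it,
        # regardless of which issue prompted them to use it.
        warnings.warn(
            "The CFG implementation is (and has always been) experimental; "
            "expect incomplete grammar coverage and breaking changes.",
            ExperimentalWarning,
            stacklevel=2,
        )
        self.grammar = grammar
```

Using a dedicated warning category lets downstream users silence or escalate it with a single `warnings.filterwarnings` rule.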
I noticed from the helpful test case in #1129—and [the relevant `vllm` code](https://github.com/vllm-project/vllm/blob/551ce01078a655068e5ec3764d0a55ac744ea425/vllm/model_executor/guided_decoding/outlines_logits_processors.py#L91)—that the cache decorator is being directly applied to the function object, so we may not even be...
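A hedged illustration of why direct application matters (generic `functools.lru_cache`, not the vllm code itself): once a cache decorator is applied to a function object at import time, the wrapper closes over that object, so later rebinding of the original name has no effect on what the wrapper calls or caches:

```python
from functools import lru_cache


def compute(x):
    return x * 2


# Applied directly to the function object at import time.
cached_compute = lru_cache(maxsize=None)(compute)
first = cached_compute(3)  # caches the result of the original compute


def compute(x):  # noqa: F811 -- deliberate rebinding of the name
    return x * 10


# The wrapper still holds the old function object and its cache.
second = cached_compute(3)
```

Here `first` and `second` are both 6 even though `compute(3)` now returns 30, which is why swapping or invalidating the cached behavior from outside is not straightforward.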
It looks like this is a feature request for `Enum` and `constr` prompts, and it's not entirely clear to me how we should translate those to prompts. Do we want...