Nicholas Chammas

229 comments of Nicholas Chammas

Is it possible this issue is limited to OpenAI models? I cannot reproduce this using Phi 3.5.

```python
from guidance import models, gen, system, user, assistant
from guidance.chat import Phi3MiniChatTemplate
...
```
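For reference, a minimal sketch of the kind of local repro I mean, assuming Phi 3.5 is loaded through Transformers (the Hugging Face model ID and template pairing are assumptions, not the exact script from the issue):

```python
from guidance import models, gen, system, user, assistant
from guidance.chat import Phi3MiniChatTemplate

# Load Phi 3.5 locally; the model ID below is an assumption
lm = models.Transformers(
    "microsoft/Phi-3.5-mini-instruct",
    chat_template=Phi3MiniChatTemplate,
)

with system():
    lm += "You are a concise assistant."
with user():
    lm += "Name one planet in our solar system."
with assistant():
    lm += gen("answer", stop="\n")

print(lm["answer"])
```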

Yes, the 0.2.0 release is [broken in various ways][1], unfortunately, and we were [promised][2] a new release back in March. Perhaps the team is waiting to finish some ongoing work...

Should this issue be closed or retitled given that the latest release is v0.3.5?

Being able to filter on custom tags would also be very helpful.

It looks like support for OpenAI is in general very limited. Trying to call `gen(stop="\n")`, for example, yields:

```
ValueError: Stop condition not yet supported for OpenAI
```

I know...
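A minimal repro of that failure, assuming a remote OpenAI model (the model name here is an assumption):

```python
from guidance import models, gen, user, assistant

# Remote OpenAI model; the model name is a placeholder
lm = models.OpenAI("gpt-4o-mini")

with user():
    lm += "Name one planet in our solar system."
with assistant():
    # The remote endpoint cannot enforce arbitrary constraints, so this
    # raises: ValueError: Stop condition not yet supported for OpenAI
    lm += gen("answer", stop="\n")
```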

> With local models, we have full control over the entire inference stack and can thus run any constraint.

To be clear, are you saying that today it's not possible...
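For contrast, here's a sketch of the kind of constraint that only works when guidance controls the inference stack, assuming a local GGUF model (the path is a placeholder):

```python
from guidance import models, select

# Local model: guidance can mask logits directly, so hard constraints work
lm = models.LlamaCpp("path/to/model.gguf")  # placeholder path

# Force the completion to be exactly one of these options
lm += "The capital of France is " + select(["Paris", "London", "Berlin"])
```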

> That being said, you could for example deploy vLLM with a hosted Azure endpoint and get full guidance support (as vLLM has the required integrations).

It would be great...
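If extra kwargs are forwarded to the underlying OpenAI client (an assumption on my part), pointing guidance at a vLLM server's OpenAI-compatible endpoint might look like this sketch; the URL, key, and model name are all placeholders:

```python
from guidance import models

# Sketch only: vLLM exposes an OpenAI-compatible API, typically at /v1.
# This assumes kwargs are passed through to the OpenAI client.
lm = models.OpenAI(
    "meta-llama/Llama-3.1-8B-Instruct",   # placeholder model name
    base_url="http://localhost:8000/v1",  # placeholder vLLM endpoint
    api_key="EMPTY",                      # vLLM generally ignores the key
)
```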

> Your repository uses the model TheBloke/Llama-2-7B-32K-Instruct-GGUF at guidance/bench/_powerlift.py, which is licensed under llama2. Meanwhile, your repository itself is licensed under MIT.

The file you are referencing does not exist....

Apart from RAG and similar instances of tool use, when would a user want to suppress output? In other words, perhaps instead of `with silent():` [what we need is `with...

Looking at [the notes here][1] (referenced in the docs for this component), it looks like there is an `annotations` property for this purpose:

> annotations to show in the editor...
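For what it's worth, Ace-style annotations (which those notes appear to describe) are plain objects like the following sketch; how this particular component accepts them is an assumption on my part:

```python
# Sketch of Ace-style editor annotations (field names per the Ace docs;
# the way this component consumes them is assumed, not confirmed)
annotations = [
    {"row": 0, "column": 0, "text": "Missing docstring", "type": "info"},
    {"row": 4, "column": 2, "text": "Unused variable", "type": "warning"},
]
```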