Marco Tulio Correia Ribeiro


I think the new [release](https://github.com/guidance-ai/guidance/discussions/429) makes this all much easier, as what you have is basically Python code. Please check it out :)

If you set `rest_call=True` and pass an `endpoint` when initializing `guidance.llms.OpenAI`, it will make a REST call to the endpoint (which can be a proxy, as long as it accepts the same...
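As a minimal sketch of what that implies for a proxy: with `rest_call=True`, the library sends an OpenAI-style JSON request body to the configured endpoint, so the proxy only needs to accept that same request schema. The endpoint URL and the exact field set below are illustrative assumptions, not taken from the guidance source.

```python
import json

# Hypothetical proxy URL -- substitute your own.
endpoint = "https://my-proxy.example.com/v1/completions"

# An OpenAI-completions-style request body; a compatible proxy should
# accept a payload shaped like this (field set is illustrative).
payload = {
    "model": "text-davinci-003",
    "prompt": "Say hello",
    "max_tokens": 16,
    "temperature": 0.0,
}
body = json.dumps(payload).encode("utf-8")  # what would go over the wire

# No network call is made here; an actual request would POST `body` to
# `endpoint` with an Authorization header.
```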

Please let us know if this is still an issue in the new [release](https://github.com/guidance-ai/guidance/discussions/429) :)

Hopefully this is gone in the new [release](https://github.com/guidance-ai/guidance/discussions/429) :) Thanks for pointing this problem out!

In the new [release](https://github.com/guidance-ai/guidance/discussions/429) we did away with template strings and now use Python's f-strings. Thanks so much for your initial work on this!
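To illustrate the change in plain Python (the toy handlebars resolver below is hypothetical, just to show the two styles produce the same string; it is not guidance's actual template engine):

```python
import re

# Old style: a handlebars-like template with {{placeholders}}, resolved
# by the library at run time.
template = "Hello {{name}}!"
values = {"name": "world"}
resolved = re.sub(r"\{\{(\w+)\}\}", lambda m: values[m.group(1)], template)

# New style: an ordinary Python f-string, interpolated by Python itself.
name = "world"
fstring = f"Hello {name}!"

assert resolved == fstring == "Hello world!"
```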

Sorry it took us so long to get to this. In the new [release](https://github.com/guidance-ai/guidance/discussions/429), it's supported [(see this)](https://github.com/guidance-ai/guidance#vertex-ai).

We tested the new [release](https://github.com/guidance-ai/guidance/discussions/429) with chat models from OpenAI, so hopefully it works now :)

We did away with handlebars in the new [release](https://github.com/guidance-ai/guidance/discussions/429). Hopefully f-strings still serve your purposes :)

In the new [release](https://github.com/guidance-ai/guidance/discussions/429) we also support [llama.cpp](https://github.com/guidance-ai/guidance#llamacpp), which loads models much faster :)