
Guidance with OpenAI-compatible servers

[Open] AntoineBlanot opened this issue 9 months ago • 3 comments

Use OpenAI-compatible servers

A lot of recent frameworks (llama.cpp, vLLM, and others) make their models available through an OpenAI-compatible API. I think it would be awesome if we could use the OpenAI client as the entry point for any model: OpenAI models, but also llama.cpp and vLLM models.
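
For reference, these servers all speak the standard OpenAI REST API, so the plain openai Python client can already talk to them just by changing the base URL; the endpoint, API key, and model name below are placeholders for a locally running server:

```python
from openai import OpenAI

# Any OpenAI-compatible server (llama.cpp's llama-server, vLLM, ...)
# exposes the same REST API, so the standard client only needs a
# different base_url. Endpoint, key, and model name are placeholders.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="not-needed",  # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```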

I have tried using the OpenAI Engine in guidance with an OpenAI-compatible server from llama.cpp but was unable to get it working (tokenizer issues, among other problems). This would be an amazing feature and would make integration with other projects much easier.
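
A rough sketch of the kind of usage I was aiming for is below. Whether models.OpenAI accepts or forwards client options such as base_url to the underlying openai client likely depends on the guidance version (and is part of what this issue asks for); the endpoint and model name are placeholders for a local llama.cpp or vLLM server:

```python
from guidance import models, gen, system, user, assistant

# Sketch only: it is unclear whether models.OpenAI forwards options
# like base_url to the underlying openai client in current releases.
# Endpoint and model name are placeholders for a local server.
lm = models.OpenAI(
    "local-model",
    base_url="http://localhost:8000/v1",
    api_key="not-needed",
)

with system():
    lm += "You are a concise assistant."
with user():
    lm += "Summarize what an OpenAI-compatible server is in one sentence."
with assistant():
    lm += gen("summary", max_tokens=60)

print(lm["summary"])
```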

AntoineBlanot · May 27 '24