torchss
> [@aksg87](https://github.com/aksg87) I have the openai plugin up and running: https://pypi.org/project/langextract-openai/ https://github.com/JustStas/langextract-openai
>
> Does this mean I can now comfortably serve any LLM from Hugging Face via vLLM using this openai...
@JustStas This is phenomenal! If I use vLLM, SGLang, or llama.cpp to serve OpenAI-compatible API endpoints, are you suggesting I try your [litellm](https://github.com/JustStas/langextract-litellm) plugin over your [openai](https://github.com/JustStas/langextract-openai) one? If so:...
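For anyone following along: vLLM, SGLang, and llama.cpp all expose an OpenAI-compatible `/v1/chat/completions` endpoint, so an OpenAI-style client only needs its `base_url` repointed at the local server. Here is a minimal sketch using the standard `openai` Python client against a local vLLM server; the port, model name, and placeholder API key are assumptions, so adapt them to your deployment:

```python
from openai import OpenAI

# Assumes vLLM was started with something like:
#   vllm serve Qwen/Qwen2.5-7B-Instruct --port 8000
# (model name and port are examples, not prescriptions)
client = OpenAI(
    base_url="http://localhost:8000/v1",  # local OpenAI-compatible endpoint
    api_key="EMPTY",                      # vLLM accepts any non-empty key by default
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",  # must match the model the server is serving
    messages=[
        {"role": "user", "content": "Extract the dates from: meeting on 2024-03-01."}
    ],
)
print(response.choices[0].message.content)
```

If langextract-openai lets you pass `base_url` and `api_key` through to its underlying client, the same repointing should work there as well, but that pass-through is an assumption about the plugin rather than something its docs confirm.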