vanna
Allow interacting with OpenAI-compatible APIs
Is your feature request related to a problem? Please describe.
Although OpenAI API access is supported, there is no direct way to change the URL to point at another OpenAI-compatible service like LM Studio or Lite LLM. Supporting this would greatly expand Vanna's interoperability and could even make several integrations (like Bedrock, Vertex, etc.) obsolete, because a gateway like Lite LLM can integrate with many different services and expose an OpenAI-compatible API where you only need to pass in the model name and URL.
Describe the solution you'd like
Make a separate class based on the OpenAI_Chat class that enables OpenAI integration, allow the URL to be passed in as an environment variable, and test that Vanna can still use a model served behind the compatible API. Call this a "generic OpenAI" class that allows connecting to a local LM Studio or Lite LLM instance exposing an LLM. Ideally, do the same with the embedding class so that a custom embedder can be used as well. This would allow greater experimentation with which LLM + embedder combination gives the best accuracy on your data.
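A minimal sketch of the idea: the class name `GenericOpenAIChat`, the `OPENAI_BASE_URL` environment variable, and the constructor parameters below are all assumptions for illustration, not part of Vanna's current API. It only relies on the standard OpenAI-compatible `/chat/completions` route, so the same object can point at OpenAI, LM Studio, or Lite LLM by changing one URL.

```python
import json
import os
import urllib.request

# Hypothetical sketch -- GenericOpenAIChat and OPENAI_BASE_URL are assumed
# names, not existing Vanna identifiers.

DEFAULT_BASE_URL = "https://api.openai.com/v1"


class GenericOpenAIChat:
    """Minimal chat wrapper targeting any OpenAI-compatible endpoint."""

    def __init__(self, model, base_url=None, api_key=None):
        # Base URL: constructor arg, else environment variable, else the
        # official OpenAI endpoint. Local servers often ignore the API key.
        self.base_url = (
            base_url or os.environ.get("OPENAI_BASE_URL", DEFAULT_BASE_URL)
        ).rstrip("/")
        self.api_key = api_key or os.environ.get("OPENAI_API_KEY", "not-needed")
        self.model = model

    def build_request(self, messages):
        """Build the POST request for the /chat/completions route."""
        payload = json.dumps({"model": self.model, "messages": messages}).encode()
        return urllib.request.Request(
            f"{self.base_url}/chat/completions",
            data=payload,
            headers={
                "Content-Type": "application/json",
                "Authorization": f"Bearer {self.api_key}",
            },
        )

    def submit_prompt(self, messages):
        # Send the request and return the assistant's reply text.
        with urllib.request.urlopen(self.build_request(messages)) as resp:
            body = json.load(resp)
        return body["choices"][0]["message"]["content"]
```

For an LM Studio instance this might be constructed as `GenericOpenAIChat(model="local-model", base_url="http://localhost:1234/v1")`; a real implementation would instead subclass `OpenAI_Chat` and pass a client configured with the custom base URL.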
Describe alternatives you've considered
The only alternative I've considered is running Vanna as an API service and calling that API as a tool from another workflow that does allow switching the embedder + LLM more flexibly (as Dify does).