querybook
Add support for Azure, OpenAI, Palm, Anthropic, Cohere, Replicate Models - using litellm
This PR adds support for models from all of the above-mentioned providers using https://github.com/BerriAI/litellm.
TLDR: querybook gets:
- more models out of the box - Azure, OpenAI, Palm, Anthropic, Cohere, Replicate
- instant integration of new models by changing the `model` param (easy to add models in the future, just change the `model` param)
- I/O logging to Posthog, Sentry, Helicone, Supabase without making any code changes, plus API key management
Here's a sample of how litellm completion is used:
import os
from litellm import completion

## set ENV variables
# ENV variables can be set in a .env file, too. Example in .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
@czgu can you please take a look at this PR when you get the chance? Happy to add docs/testing if this initial commit looks good to you 😊
@ishaan-jaff thanks for your PR! We'll have an update of the LLM feature soon and will revisit this PR after that.
@jczhong84 any update on this?
@jczhong84 What was LiteLLM missing to be useful to you? Any feedback here would be helpful
Hi @ishaan-jaff, we've done some updates on the LLM feature; sorry for forgetting to get back to you.
We're using LangChain, which I believe also supports LiteLLM. We're organizing the LLM providers as plugins, so you're welcome to add a new plugin for LiteLLM.
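For reference, a rough sketch of what such a plugin could look like, assuming LangChain's ChatLiteLLM wrapper; the class and method names below are illustrative, not Querybook's actual ai_assistant plugin interface:

from langchain.chat_models import ChatLiteLLM

# Hypothetical provider plugin: the class/method names are illustrative only,
# not the real Querybook plugin interface.
class LiteLLMAssistant:
    """Routes Querybook's LLM calls through LiteLLM via LangChain."""

    def __init__(self, model_name: str = "gpt-3.5-turbo", temperature: float = 0.0):
        # Any LiteLLM-supported model id should work here, e.g. "claude-instant-1",
        # "command-nightly", or an Azure deployment.
        self._llm = ChatLiteLLM(model=model_name, temperature=temperature)

    def get_llm(self):
        # Return the LangChain chat model for Querybook to use.
        return self._llm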