add support for anthropic, bedrock, azure, huggingface inference, togetherai, replicate, ai21, etc.
Hi @ajhai @vegito22,
Noticed you're calling the OpenAI chat completions endpoint. I'm working on litellm (a simple library to standardize LLM API calls - https://github.com/BerriAI/litellm) and wanted to see how I could be helpful.
My PR adds support for new LLM providers (bedrock, togetherai, huggingface tgi, replicate, ai21, cohere, etc.) by replacing OpenAI's ChatCompletion with litellm.completion.
Curious if you find this useful?
Happy to add additional tests/documentation if the initial PR looks good.
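To illustrate the swap the PR describes: litellm.completion mirrors the OpenAI chat-completions call signature, so the call site stays the same and only the model string changes per provider. This is a hedged sketch, not the PR's actual diff; the `ask`/`build_messages` helpers and the example model strings are illustrative.

```python
def build_messages(prompt):
    # OpenAI-style chat message list, shared across all providers
    return [{"role": "user", "content": prompt}]

def ask(model, prompt):
    # Before the PR this call site would be openai.ChatCompletion.create(...);
    # litellm.completion keeps the same interface (requires `pip install litellm`
    # and the relevant provider API key in the environment).
    from litellm import completion
    return completion(model=model, messages=build_messages(prompt))

# Illustrative model strings -- litellm routes to a provider based on the name:
#   ask("gpt-3.5-turbo", "hi")   # OpenAI
#   ask("claude-2", "hi")        # Anthropic
#   ask("command-nightly", "hi") # Cohere
```

The response object follows the OpenAI shape regardless of provider, which is what lets the rest of the calling code stay unchanged.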
@krrishdholakia thanks for the PR. Would you mind adding litellm as a provider? We already support LocalAI, and litellm can sit alongside it. See https://github.com/trypromptly/LLMStack/pull/11 for more.
@ajhai what's litellm missing for you to be comfortable replacing the OpenAI SDK with this?